On characterization of directed divergence of type \(\beta\) through information equation. (English) Zbl 0548.94009

The authors attempt to solve the equations \[ (1.3)\quad I\begin{pmatrix} x,y,z \\ l,m,n \end{pmatrix} = I\begin{pmatrix} x+y,0,z \\ l+m,0,n \end{pmatrix} + I\begin{pmatrix} x,y,0 \\ l,m,0 \end{pmatrix} \] and \[ (1.5)\quad I\begin{pmatrix} \lambda x,\lambda y,\lambda z \\ \mu l,\mu m,\mu n \end{pmatrix} = \lambda^{\beta}\mu^{1-\beta} I\begin{pmatrix} x,y,z \\ l,m,n \end{pmatrix}, \] with \(\lambda,\mu>0\) and \(\beta>0\), \(\beta\neq 1\), on the domain \[ (*)\quad D^2=\{(x,y,z;l,m,n):\ x,y,z,l,m,n\geq 0,\ xy+yz+zx>0,\ lm+mn+nl>0\}, \] under the assumption that \(I\) is symmetric.
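For orientation, the directed divergence of type \(\beta\) characterized in [P. N. Rathie and P. L. Kannappan, Inform. and Control 20, 38-45 (1972; Zbl 0231.94015)] is commonly written (recalled here for the reader's convenience; this formula is not reproduced in the paper under review) as
\[
I_n^{\beta}(P\,\|\,Q)=\frac{1}{2^{\beta-1}-1}\left[\sum_{k=1}^{n} p_k^{\beta}\,q_k^{1-\beta}-1\right],\qquad \beta>0,\ \beta\neq 1,
\]
which in the limit \(\beta\to 1\) reduces to Kullback's directed divergence \(\sum_{k=1}^{n} p_k\log(p_k/q_k)\). Note that the joint homogeneity in (1.5), of degree \(\beta\) in the first row and \(1-\beta\) in the second, matches the exponents \(p_k^{\beta}q_k^{1-\beta}\) in this expression.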
Reviewer's remarks. The authors are not precise about the domain and the notation they use. For example, in (2.13) they determine the value of \(I\begin{pmatrix} x,0,0 \\ l,0,0 \end{pmatrix}\), whose argument does not belong to \((*)\). Likewise, they define \(f\) (in (2.14)) on \([0,1]\times [0,1]\) in terms of \(I\) and use \(f(1,1)=f(0,0)\) (which involves \(I\begin{pmatrix} 0,1,0 \\ 0,2,0 \end{pmatrix}\) etc., quantities that are not defined at all) to determine \(I\). Nor is it made clear why an \(I\) satisfying (1.3) and (1.5) should have the form (2.5), which involves the function \(I_n^{\beta}\) satisfying the postulates 1 to 4. Thus the authors have not determined all solutions of (1.3) and (1.5) under symmetry. Moreover, the reviewer, jointly with P. N. Rathie, characterized \(I_n^{\beta}\) satisfying the postulates 1 to 4 (with an even weaker form of postulate 2) by means of a functional equation [Inf. Control 20, 38-45 (1972; Zbl 0231.94015)]. The reviewer (jointly with Kamiński and Mikusiński) characterized directed divergence, inaccuracy and generalized directed divergence using equations of the form (1.3), (1.5) and (2.16) [A. Kamiński, the reviewer and J. Mikusiński, Ann. Pol. Math. 36, 101-110 (1979; Zbl 0404.94004); the reviewer and A. Kamiński, Bull. Acad. Pol. Sci., Sér. Sci. Math. Astron. Phys. 25, 925-928 (1977; Zbl 0365.94028) and Ann. Pol. Math. 38, 289-294 (1980; Zbl 0449.94008)]; all of these are missing from the references.
Reviewer: Pl.Kannappan


94A17 Measures of information, entropy
Full Text: EuDML


[1] J. Aczél: Results on entropy equation. Bull. Acad. Polon. Sci. Sér. Sci. Math. 25 (1977), 12-17.
[2] L. L. Campbell: Characterization of Entropy in Arbitrary Probability Spaces. Preprint No. 1970-32, Queen’s University, Kingston, Ontario 1970.
[3] R. G. Gallager: Information Theory and Reliable Communication. John Wiley and Sons, New York 1968, 523-524. · Zbl 0198.52201
[4] A. Kamiński, J. Mikusiński: On the entropy equation. Bull. Acad. Polon. Sci. Sér. Sci. Math. 22 (1974), 319-323.
[5] S. Kullback: Information Theory and Statistics. John Wiley and Sons, New York 1959. · Zbl 0088.10406
[6] P. L. Kannappan, P. N. Rathie: An application of a functional equation to information theory. Ann. Polon. Math. 26 (1972). · Zbl 0235.39001
[7] D. F. Kerridge: Inaccuracy and inference. J. Roy. Statist. Soc. Ser. B 23 (1961), 184-194. · Zbl 0112.10302
[8] P. N. Rathie, P. L. Kannappan: A directed divergence function of type \(\beta\). Inform. and Control 20 (1972), 38-45. · Zbl 0231.94015
[9] A. Rényi: On measures of entropy and information. Proc. Fourth Berkeley Symposium Math. Statist. and Probability 1 (1961), 547-561.
[10] B. D. Sharma, Ram Autar: Relative information functions and their type \((\alpha, \beta)\) generalizations. Metrika 27 (1974), 41-50. · Zbl 0277.94012
[11] B. D. Sharma, R. S. Soni: A new generalized functional equation for inaccuracy and entropy of kind \(\beta\). Funkcial. Ekvac. 17 (1974), 1-11. · Zbl 0312.94009
[12] H. Theil: Economics and Information Theory. North-Holland, Amsterdam 1967.