# zbMATH — the first resource for mathematics

$$(h,\Psi)$$-entropy differential metric. (English) Zbl 0898.62005
The paper considers generalized entropies: monotone functions $$h$$ of integrals of concave or convex functions of probability densities. The class of Rényi entropies is an example. For parametrized families of densities, the Hessians of such entropies along tangent directions of the parameter space define differential metrics of a Riemannian geometry. The corresponding second-order covariant tensors depend on the two functions figuring in the definition of the entropy. For the class of Rényi entropies these tensors are calculated explicitly when the family of densities is exponential, e.g. Bernoulli, geometric, Pareto, Erlang, etc. Asymptotic normality of the empirical versions of these tensors is established under appropriate regularity conditions on the entropy and on the family of densities.
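For orientation, the construction can be sketched as follows (the notation is the reviewer's reconstruction in the style of the $$(h,\Phi)$$-entropy literature, e.g. [5, 19], not a quotation from the paper). A generalized entropy of a density $$p_\theta$$ has the form
$$
H_{h,\Phi}(p_\theta)=h\!\left(\int_{\mathcal X}\Phi\bigl(p_\theta(x)\bigr)\,d\mu(x)\right),
$$
with the Rényi class recovered by $$h(t)=\frac{1}{1-\alpha}\log t$$ and $$\Phi(t)=t^{\alpha}$$, $$\alpha>0$$, $$\alpha\neq 1$$. Following Burbea and Rao's construction, the entropy differential metric arises (up to a sign convention) from the second-order variation of $$H_{h,\Phi}$$ along tangent directions $$dp=\sum_i \frac{\partial p_\theta}{\partial\theta_i}\,d\theta_i$$:
$$
ds^2_{h,\Phi}(\theta)\;\propto\;-\,h'\!\left(\int_{\mathcal X}\Phi(p_\theta)\,d\mu\right)\int_{\mathcal X}\Phi''\bigl(p_\theta(x)\bigr)\,(dp)^2\,d\mu(x).
$$
In the Shannon case, $$h(t)=t$$ and $$\Phi(t)=-t\log t$$ give $$\Phi''(t)=-1/t$$, and the expression reduces to $$\int (dp)^2/p_\theta\,d\mu$$, the Fisher information metric.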
Reviewer: I.Vajda (Praha)

##### MSC:
 62B10 Statistical aspects of information-theoretic topics
 62E20 Asymptotic distribution theory in statistics
 53B20 Local Riemannian geometry
##### References:
 [1] S. I. Amari: A foundation of information geometry. Vol. 66-A, 1983, pp. 1-10.
 [2] C. Atkinson, A. F. S. Mitchell: Rao’s distance measure. Vol. 43, 1981, pp. 345-365. · Zbl 0534.62012
 [3] S. Arimoto: Information-theoretical considerations on estimation problems. Information and Control 19 (1971), pp. 181-194. · Zbl 0222.94022
 [4] J. Burbea: Informative geometry of probability spaces. Vol. 4, 1986, pp. 347-378. · Zbl 0604.62006
 [5] J. Burbea, C. R. Rao: Entropy differential metric, distance and divergence measures in probability spaces: a unified approach. J. Multivariate Analysis 12 (1982a), pp. 575-596. · Zbl 0526.60015
 [6] J. Burbea, C. R. Rao: On the convexity of some divergence measures based on entropy functions. Vol. IT-28, 1982b, pp. 489-495. · Zbl 0479.94009
 [7] J. Burbea, C. R. Rao: On the convexity of higher order Jensen differences based on entropy functions. Vol. IT-28, 1982c, pp. 961-963. · Zbl 0497.94002
 [8] N. N. Cencov: Statistical Decision Rules and Optimal Inference. 1982.
 [9] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Vol. 2, 1967, pp. 299-318. · Zbl 0157.25802
 [10] K. Ferentinos, T. Papaioannou: New parametric measures of information. Vol. 51, 1981, pp. 193-208. · Zbl 0524.62005
 [11] C. Ferreri: Hypoentropy and related heterogeneity divergence measures. Vol. 40, 1980, pp. 55-118. · Zbl 0454.62004
 [12] J. Havrda, F. Charvát: Concept of structural $$\alpha$$-entropy. Vol. 3, 1967, pp. 30-35.
 [13] D. Morales, L. Pardo, L. Salicrú, M. L. Menéndez: New parametric measures of information based on generalized $$R$$-divergences. 1993, pp. 473-488. · Zbl 0810.62011
 [14] R. J. Muirhead: Aspects of Multivariate Statistical Theory. 1982.
 [15] O. Onicescu: Énergie informationnelle. Vol. 263, 1966, pp. 841-842. · Zbl 0143.41206
 [16] C. R. Rao: Information and accuracy attainable in the estimation of statistical parameters. Vol. 37, 1945, pp. 81-91. · Zbl 0063.06420
 [17] C. R. Rao: Differential metrics in probability spaces. 1987.
 [18] A. Rényi: On measures of entropy and information. Vol. 1, 1961, pp. 547-561.
 [19] M. Salicrú, M. L. Menéndez, D. Morales, L. Pardo: Asymptotic distribution of $$(h,\Phi)$$-entropies. Vol. 22(7), 1993, pp. 2015-2031. · Zbl 0791.62005
 [20] C. E. Shannon: A mathematical theory of communication. Vol. 27, 1948, pp. 379-423. · Zbl 1154.94303
 [21] B. D. Sharma, I. J. Taneja: Entropy of type $$(\alpha, \beta)$$ and other generalized measures in information theory. Vol. 22, 1975, pp. 205-215. · Zbl 0328.94012
 [22] B. D. Sharma, P. Mittal: New non-additive measures of relative information. Vol. 2, 1975, pp. 122-133. · Zbl 0439.94006
 [23] I. J. Taneja: A study of generalized measures in information theory. 1975.
 [24] I. J. Taneja: On generalized information measures and their applications. Vol. 76, 1989, pp. 327-413.
 [25] I. Vajda, K. Vašek: Majorization, concave entropies and comparison of experiments. Vol. 14, 1985, pp. 105-115. · Zbl 0601.62006
 [26] J. C. A. Van der Lubbe: $$R$$-norm information and a general class of measures for certainty and information. M.Sc. Thesis, Delft University of Technology, Dept. E.E., 1977.
 [27] J. C. A. Van der Lubbe: A generalized probabilistic theory of the measurement of certainty and information. Ph.D. Thesis, Delft University of Technology, Dept. E.E., 1981.
 [28] R. S. Varma: Generalizations of Rényi’s entropy of order $$\alpha$$. Vol. 1, 1966, pp. 34-48. · Zbl 0166.15401
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.