
zbMATH — the first resource for mathematics

On generalized entropies, Bayesian decisions and statistical diversity. (English) Zbl 1143.94006
Let \(X\) be a discrete random variable with distribution \(p = (p(i) : i \in \mathcal{I})\) on a finite index set \(\mathcal{I}\). The first author [Theory of statistical inference and information. Theory and Decision Library, Series B: Mathematical and Statistical Methods, 11. Dordrecht etc.: Kluwer Academic Publishers (1989; Zbl 0711.62002)] studied the \(\psi\)-entropies \[ H_\psi(p) \equiv H_\psi(X) = \sum_{i \in \mathcal{I}} p(i)\,\psi(p(i)), \] where \(\psi\) is a decreasing continuous function on \((0,1]\) with \(\psi(1) = 0\). The power entropies \(H_\alpha(X)\) [J. Havrda and F. Charvát, Kybernetika 3, 30–35 (1967; Zbl 0178.22401)] are obtained by taking for \(\psi\) the power function \(\psi_\alpha(\pi) = (1 - \pi^{\alpha-1})/(\alpha-1)\), \(\alpha \in \mathbb{R}\), where \[ \psi_\alpha(0) = \lim_{\pi \downarrow 0} \psi_\alpha(\pi) \] for \(\alpha \neq 1\) and \(\psi_1(\pi) = -\ln\pi\); i.e., \(\alpha = 1\) gives the Shannon entropy, and another important special case is the quadratic entropy \(H_2(X)\). It is shown that treating generalized entropies of information sources as generalized informations obtained from direct observations leads to nonconcave entropies, in particular to infinitely many nonconcave power entropies. Relations between the entropies \(H_\alpha\), \(\alpha \geq 0\), and the errors of Bayesian decisions about \(X\) are investigated; in particular, the quadratic entropy \(H_2(X)\) is shown to provide estimates of the Bayes error which are on average more than 100 % more efficient than those based on the remaining entropies.
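As a numerical illustration (not part of the reviewed paper), the power entropies can be evaluated directly from the definition above; the function name `power_entropy` and the sample distribution are our own choices. The sketch checks the two special cases named in the review: \(\alpha = 1\) (Shannon entropy) and \(\alpha = 2\) (quadratic entropy \(H_2(X) = 1 - \sum_i p(i)^2\)).

```python
import math

def power_entropy(p, alpha):
    """Havrda-Charvat power entropy H_alpha(p) = sum_i p(i) * psi_alpha(p(i)),
    with psi_alpha(pi) = (1 - pi**(alpha - 1)) / (alpha - 1) for alpha != 1
    and psi_1(pi) = -ln(pi), the Shannon case."""
    if alpha == 1:
        # Shannon entropy: -sum p ln p (terms with p(i) = 0 contribute 0)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return sum(pi * (1 - pi ** (alpha - 1)) / (alpha - 1) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
shannon = power_entropy(p, 1)     # -sum p ln p = 1.5 ln 2
quadratic = power_entropy(p, 2)   # 1 - sum p^2 = 1 - 0.375 = 0.625
```

Taking `alpha` close to 1 in the general branch recovers the Shannon value, consistent with \(\psi_1(\pi) = -\ln\pi\) being the \(\alpha \to 1\) limit of \(\psi_\alpha\).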

MSC:
94A17 Measures of information, entropy
62C10 Bayesian problems; characterization of Bayes procedures
Full Text: EuDML Link
References:
[1] Cover T., Thomas J.: Elements of Information Theory. Wiley, New York 1991 · Zbl 1140.94001
[2] Cressie N., Read T. R. C.: Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440-464 · Zbl 0571.62017
[3] Csiszár I.: Informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. Ser. A 8 (1963), 85-108 · Zbl 0124.08703
[4] Csiszár I.: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318 · Zbl 0157.25802
[5] Csiszár I.: A class of measures of informativity of observation channels. Period. Math. Hungar. 2 (1972), 191-213 · Zbl 0247.94018
[6] Dalton H.: The Inequality of Incomes. Routledge & Kegan Paul, London 1925
[7] Devijver P., Kittler J.: Pattern Recognition: A Statistical Approach. Prentice Hall, Englewood Cliffs, NJ 1982 · Zbl 0542.68071
[8] Devroye L., Györfi, L., Lugosi G.: A Probabilistic Theory of Pattern Recognition. Springer, Berlin 1996 · Zbl 0853.68150
[9] Emlen J. M.: Ecology: An Evolutionary Approach. Addison-Wesley, Reading 1973
[10] Gini C.: Variabilitá e Mutabilitá. Studi Economico-Giuridici della R. Univ. di Cagliari. 3 (1912), Part 2, p. 80
[11] Harremoës P., Topsøe F.: Inequalities between entropy and index of coincidence. IEEE Trans. Inform. Theory 47 (2001), 2944-2960 · Zbl 1017.94005
[12] Havrda J., Charvát F.: Concept of structural \(a\)-entropy. Kybernetika 3 (1967), 30-35 · Zbl 0178.22401
[13] Höffding W.: Masstabinvariante Korrelationstheorie. Teubner, Leipzig 1940 · JFM 66.0649.02
[14] Höffding W.: Stochastische Abhängigkeit und funktionaler Zusammenhang. Skand. Aktuar. Tidskr. 25 (1942), 200-207 · Zbl 0027.41401
[15] Kovalevskij V. A.: The problem of character recognition from the point of view of mathematical statistics. Character Readers and Pattern Recognition, 3-30. Spartan Books, New York 1967
[16] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 · Zbl 0656.62004
[17] Liese F., Vajda I.: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412 · Zbl 1287.94025
[18] Marshall A. W., Olkin I.: Inequalities: Theory of Majorization and its Applications. Academic Press, New York 1979 · Zbl 1219.26003
[19] Morales D., Pardo, L., Vajda I.: Uncertainty of discrete stochastic systems. IEEE Trans. Systems, Man Cybernet. Part A 26 (1996), 681-697
[20] Pearson K.: On the theory of contingency and its relation to association and normal correlation. Drapers Company Research Memoirs, Biometric Ser. 1, London 1904 · JFM 36.0313.12
[21] Perez A.: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1-21 · Zbl 0153.48403
[22] Rényi A.: On measures of dependence. Acta Math. Acad. Sci. Hungar. 10 (1959), 441-451 · Zbl 0091.14403
[23] Rényi A.: On measures of entropy and information. Proc. Fourth Berkeley Symposium on Probab. Statist., Volume 1, Univ. Calif. Press, Berkeley 1961, pp. 547-561
[24] Sen A.: On Economic Inequality. Oxford Univ. Press, London 1973
[25] Simpson E. H.: Measurement of diversity. Nature 163 (1949), 688 · Zbl 0032.03902
[26] Tschuprow A.: Grundbegriffe und Grundprobleme der Korrelationstheorie. Berlin 1925 · JFM 51.0392.03
[27] Vajda I.: Bounds on the minimal error probability and checking a finite or countable number of hypotheses. Information Transmission Problems 4 (1968), 9-17
[28] Vajda I.: Theory of Statistical Inference and Information. Kluwer, Boston 1989 · Zbl 0711.62002
[29] Vajda I., Vašek K.: Majorization concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105-115 · Zbl 0601.62006
[30] Vajda I., Zvárová J.: On relations between informations, entropies and Bayesian decisions. Prague Stochastics 2006 (M. Hušková and M. Janžura, eds.), Matfyzpress, Prague 2006, pp. 709-718
[31] Zvárová J.: On measures of statistical dependence. Čas. pěst. matemat. 99 (1974), 15-29 · Zbl 0282.62048
[32] Zvárová J.: On medical informatics structure. Internat. J. Medical Informatics 44 (1997), 75-81
[33] Zvárová J.: Information Measures of Stochastic Dependence and Diversity: Theory and Medical Informatics Applications. Doctor of Sciences Dissertation, Academy of Sciences of the Czech Republic, Institute of Informatics, Prague 1998
[34] Zvárová J., Mazura I.: Stochastic Genetics (in Czech). Charles University, Karolinum, Prague 2001
[35] Zvárová J., Vajda I.: On genetic information, diversity and distance. Methods of Inform. in Medicine 2 (2006), 173-179