
A generalization of the Havrda-Charvat and Tsallis entropy and its axiomatic characterization. (English) Zbl 1472.94038

Summary: In this communication, we characterize a measure of information of types \(\alpha\), \(\beta\), and \(\gamma\) by taking certain axioms parallel to those considered earlier by J. Havrda and F. Charvát [Kybernetika 3, 30–35 (1967; Zbl 0178.22401)], along with the recursive relation
\[
H_n(p_1,\dots,p_n;\alpha,\beta,\gamma)-H_{n-1}(p_1+p_2,p_3,\dots,p_n;\alpha,\beta,\gamma)
=\frac{A_{(\alpha,\gamma)}}{A_{(\alpha,\gamma)}-A_{(\beta,\gamma)}}\,(p_1+p_2)^{\alpha/\gamma}\,H_2\!\left(\frac{p_1}{p_1+p_2},\frac{p_2}{p_1+p_2};\alpha,\gamma\right)
+\frac{A_{(\beta,\gamma)}}{A_{(\beta,\gamma)}-A_{(\alpha,\gamma)}}\,(p_1+p_2)^{\beta/\gamma}\,H_2\!\left(\frac{p_1}{p_1+p_2},\frac{p_2}{p_1+p_2};\gamma,\beta\right),
\]
with \(\alpha\neq\gamma\neq\beta\) and \(\alpha,\beta,\gamma>0\). Some properties of this measure are also studied. This measure includes Shannon's information measure as a special case.
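The recursion can be sanity-checked numerically under assumptions about the closed form of the measure, which the summary does not state. The sketch below assumes a Havrda-Charvát-style candidate \(H_n(p;\alpha,\beta,\gamma)=\bigl(\sum_i p_i^{\alpha/\gamma}-\sum_i p_i^{\beta/\gamma}\bigr)/\bigl(A_{(\alpha,\gamma)}-A_{(\beta,\gamma)}\bigr)\) with \(A_{(s,\gamma)}=2^{1-s/\gamma}-1\), a two-parameter \(H_2(q;s,\gamma)=\bigl(\sum_i q_i^{s/\gamma}-1\bigr)/A_{(s,\gamma)}\), and reads the term \(H_2(\cdot;\gamma,\beta)\) as the \((\beta,\gamma)\)-type term; all of these are assumptions for illustration, not definitions taken from the paper.

```python
# Minimal numerical sketch (assumed closed form, not the paper's definition):
# verify the recursive relation for one choice of parameters and probabilities.

def A(s, gamma):
    # Assumed normalizing constant A_(s,gamma) = 2^(1 - s/gamma) - 1.
    return 2.0 ** (1.0 - s / gamma) - 1.0

def H(p, alpha, beta, gamma):
    # Assumed candidate (alpha, beta, gamma)-measure of the distribution p.
    sa = sum(pi ** (alpha / gamma) for pi in p)
    sb = sum(pi ** (beta / gamma) for pi in p)
    return (sa - sb) / (A(alpha, gamma) - A(beta, gamma))

def H2(q, s, gamma):
    # Assumed two-parameter measure appearing on the right-hand side.
    return (sum(qi ** (s / gamma) - 0.0 for qi in q) - 1.0) / A(s, gamma)

if __name__ == "__main__":
    alpha, beta, gamma = 2.0, 3.0, 1.5
    p = [0.1, 0.2, 0.3, 0.4]
    s = p[0] + p[1]
    w = [p[0] / s, p[1] / s]

    lhs = H(p, alpha, beta, gamma) - H([s] + p[2:], alpha, beta, gamma)
    rhs = (A(alpha, gamma) / (A(alpha, gamma) - A(beta, gamma))
           * s ** (alpha / gamma) * H2(w, alpha, gamma)
           + A(beta, gamma) / (A(beta, gamma) - A(alpha, gamma))
           * s ** (beta / gamma) * H2(w, beta, gamma))

    print(lhs, rhs)  # the two values agree up to rounding error
```

Under these assumptions the two sides coincide because the weighted \(H_2\) terms telescope into the difference \(\sum_{i=1,2} p_i^{\alpha/\gamma} - (p_1+p_2)^{\alpha/\gamma}\) (and likewise for \(\beta\)), which is exactly what merging \(p_1\) and \(p_2\) removes from \(H_n\).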

MSC:

94A17 Measures of information, entropy
94A20 Sampling theory in information and communication theory

Citations:

Zbl 0178.22401

References:

[1] Aczél, J.; Daróczy, Z., On Measures of Information and Their Characterization (1975), New York, NY, USA: Academic Press, New York, NY, USA · Zbl 0345.94022
[2] Faddeev, D. K., On the concept of entropy of a finite probabilistic scheme, Uspekhi Matematicheskikh Nauk, 11, 1(67), 227-231 (1956) · Zbl 0071.13103
[3] Chaundy, T. W.; McLeod, J. B., On a functional equation, Proceedings of the Edinburgh Mathematical Society. Series II, 12, 43, 6-7 (1960) · Zbl 0100.32703
[4] Sharma, B. D.; Taneja, I. J., Functional measures in information theory, Funkcialaj Ekvacioj, 17, 181-191 (1974) · Zbl 0316.94021
[5] Shannon, C. E., A mathematical theory of communication, The Bell System Technical Journal, 27, 379-423, 623-636 (1948) · Zbl 1154.94303
[6] Havrda, J.; Charvát, F., Quantification method of classification processes. Concept of structural \(\alpha \)-entropy, Kybernetika, 3, 30-35 (1967) · Zbl 0178.22401
[7] Daróczy, Z., Generalized information functions, Information and Computation, 16, 36-51 (1970) · Zbl 0205.46901
[8] Tsallis, C., Possible generalization of Boltzmann-Gibbs statistics, Journal of Statistical Physics, 52, 1-2, 479-487 (1988) · Zbl 1082.82501 · doi:10.1007/BF01016429
[9] Hanel, R.; Thurner, S., A comprehensive classification of complex statistical systems and an ab-initio derivation of their entropy and distribution functions, Europhysics Letters, 93, 2 (2011) · doi:10.1209/0295-5075/93/20006
[10] Hanel, R.; Thurner, S.; Gell-Mann, M., Generalized entropies and logarithms and their duality relations, Proceedings of the National Academy of Sciences of the United States of America, 109, 47, 19151-19154 (2012) · doi:10.1073/pnas.1216885109
[11] Suyari, H., Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy, IEEE Transactions on Information Theory, 50, 8, 1783-1787 (2004) · Zbl 1298.94040 · doi:10.1109/TIT.2004.831749
[12] Ilić, V. M.; Stanković, M. S.; Mulalić, E. H., Comments on “Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy”, IEEE Transactions on Information Theory, 59, 10, 6950-6952 (2013) · doi:10.1109/TIT.2013.2259958
[13] Vajda, I., Axioms for \(\alpha \)-entropy of a generalized probability scheme, Kybernetika, 2, 105-112 (1968) · Zbl 0193.48201