Entropy and effective support size. (English) Zbl 1135.94316
Summary: The notion of the effective size of support (Ess) of a random variable is introduced. A small set of natural requirements that a measure of Ess should satisfy is presented. The measure with the prescribed properties is in a direct (exp-) relationship to the family of Rényi's \(\alpha\)-entropies, which also includes Shannon's entropy \(H\). Considerations of the choice of the value of \(\alpha\) imply that \(\exp(H)\) appears to be the most appropriate measure of Ess. Thanks to their log/exp relationship, entropy and Ess can be viewed as two aspects of the same thing. In probability and statistics the Ess aspect could appear more basic than the entropic one.
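
The log/exp relationship referred to above can be made explicit; the following is a minimal sketch assuming the standard definition of Rényi's entropies (the notation \(H_\alpha\) and \(\mathrm{Ess}_\alpha\) is illustrative and not taken from the paper):
\[
H_\alpha(p) \;=\; \frac{1}{1-\alpha}\,\log \sum_{i=1}^{n} p_i^{\alpha},
\qquad
\mathrm{Ess}_\alpha(p) \;=\; \exp\bigl(H_\alpha(p)\bigr),
\]
with Shannon's entropy \(H\) recovered in the limit \(\alpha \to 1\), so that \(\mathrm{Ess}_1(p) = \exp(H(p))\). For the uniform distribution on \(n\) points one gets \(\sum_i p_i^{\alpha} = n^{1-\alpha}\), hence \(\mathrm{Ess}_\alpha = n\) for every \(\alpha\), matching the intuition that all \(n\) outcomes are effectively supported; a distribution concentrated on a few points yields a correspondingly smaller value.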

MSC:
94A17 Measures of information, entropy
60A10 Probabilistic measure theory
62B10 Statistical aspects of information-theoretic topics