Stam inequality on \(\mathbb Z_n\). (English) Zbl 1213.94057

Summary: We prove a discrete version of the Stam inequality for random variables taking values in \(\mathbb Z_n\).
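For context (this is the standard continuous statement, not quoted from the paper under review): the classical Stam inequality bounds the Fisher information \(J(\cdot)\) of a sum of independent random variables \(X\) and \(Y\) with smooth densities by

\[
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\]

with equality if and only if \(X\) and \(Y\) are Gaussian. The paper establishes an analogue of this inequality for a suitable notion of Fisher information on the cyclic group \(\mathbb Z_n\).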

MSC:

94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics

References:

[1] Blachman, N. M., The convolution inequality for entropy powers, IEEE Trans. Inform. Theory, IT-11, 267-271 (1965) · Zbl 0134.37401
[2] Carlen, E., Superadditivity of Fisher’s information and logarithmic Sobolev inequalities, J. Funct. Anal., 101, 194-211 (1991) · Zbl 0732.60020
[3] Harremoës, P., Binomial and Poisson distributions as maximum entropy distributions, IEEE Trans. Inform. Theory, 47, 2039-2041 (2001) · Zbl 0999.94012
[4] Johnson, O. T., Log-concavity and the maximum entropy property of the Poisson distribution, Stochastic Process. Appl., 117, 791-802 (2007) · Zbl 1115.60012
[5] Johnstone, I. M.; MacGibbon, B., Une mesure d’information caractérisant la loi de Poisson, (Séminaire de Probabilités, XXI. Séminaire de Probabilités, XXI, Lecture Notes in Math., vol. 1247 (1987), Springer: Springer Berlin), 563-573 · Zbl 0621.60028
[6] Kagan, A.; Landsman, Z., Statistical meaning of Carlen’s superadditivity of the Fisher information, Statist. Probab. Lett., 32, 175-179 (1997) · Zbl 0874.60002
[7] Kagan, A., A discrete version of Stam inequality and a characterization of the Poisson distribution, J. Statist. Plann. Inference, 92, 7-12 (2001) · Zbl 0964.62013
[8] Kagan, A., Letter to the editor: “A discrete version of Stam inequality and a characterization of the Poisson distribution” [J. Statist. Plann. Inference 92 (2001) 7-12], J. Statist. Plann. Inference, 99, 1 (2001) · Zbl 0964.62013
[9] Kontoyiannis, I.; Harremoës, P.; Johnson, O., Entropy and the law of small numbers, IEEE Trans. Inform. Theory, 51, 466-472 (2005) · Zbl 1297.94016
[10] Madiman, M.; Barron, A. R., Generalized entropy power inequalities and monotonicity properties of information, IEEE Trans. Inform. Theory, 53, 2317-2329 (2007) · Zbl 1326.94034
[11] Madiman, M.; Johnson, O.; Kontoyiannis, I., Fisher information, compound Poisson approximation and the Poisson channel, (Proc. IEEE Intl. Symp. Inform. Theory, Nice, France (2007))
[12] Papathanasiou, V., Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities, J. Multivariate Anal., 44, 256-265 (1993) · Zbl 0765.62055
[13] Stam, A. J., Some inequalities satisfied by the quantities of information of Fisher and Shannon, Inform. Control, 2, 101-112 (1959) · Zbl 0085.34701
[14] Villani, C., Cercignani’s conjecture is sometimes true and always almost true, Comm. Math. Phys., 234, 455-490 (2003) · Zbl 1041.82018
[15] Voiculescu, D., The analogues of entropy and of Fisher’s information measure in free probability theory. V. Noncommutative Hilbert transforms, Invent. Math., 132, 189-227 (1998) · Zbl 0930.46053
[16] Zamir, R., A proof of the Fisher information inequality via a data processing argument, IEEE Trans. Inform. Theory, 44, 1246-1250 (1998) · Zbl 0901.62005