Rényi divergence and the central limit theorem. (English) Zbl 1466.60065

Summary: We explore properties of the \(\chi^{2}\)- and Rényi distances to the normal law and, in particular, propose necessary and sufficient conditions under which these distances tend to zero in the central limit theorem (with exact rates with respect to the increasing number of summands).
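As a quick numerical illustration (not taken from the paper), the sketch below estimates the \(\chi^{2}\) divergence \(\chi^{2}(p_n\,\|\,\varphi)=\int p_n(x)^2/\varphi(x)\,dx-1\) of the normalized sum of \(n\) i.i.d. Uniform(0,1) variables from the standard normal \(\varphi\), using the closed-form Irwin-Hall density and a trapezoidal rule on a grid; the choice of uniform summands and the grid resolution are assumptions for the demonstration.

```python
# Sketch: chi-squared distance of a normalized uniform sum to N(0,1).
# Assumes Uniform(0,1) summands (Irwin-Hall density); not from the paper.
import math
import numpy as np

def irwin_hall_pdf(s, n):
    """Density of the sum of n iid Uniform(0,1) variables at point s."""
    if s <= 0 or s >= n:
        return 0.0
    total = sum((-1) ** k * math.comb(n, k) * (s - k) ** (n - 1)
                for k in range(int(math.floor(s)) + 1))
    return total / math.factorial(n - 1)

def chi2_to_normal(n, grid_points=20001):
    """chi^2 divergence of (S_n - n/2)/sqrt(n/12) from N(0,1),
    computed by a trapezoidal rule over the (compact) support."""
    sigma = math.sqrt(n / 12.0)                      # std. dev. of S_n
    half = n / (2.0 * sigma)                         # support is [-half, half]
    x = np.linspace(-half, half, grid_points)
    # density of the normalized sum: p_n(x) = sigma * f_{S_n}(n/2 + sigma*x)
    p = np.array([sigma * irwin_hall_pdf(n / 2.0 + sigma * xi, n) for xi in x])
    phi = np.exp(-x ** 2 / 2.0) / math.sqrt(2.0 * math.pi)
    integrand = p ** 2 / phi
    dx = x[1] - x[0]
    return float((integrand[:-1] + integrand[1:]).sum() * dx / 2.0) - 1.0
```

Since the normalized sum has compact support, the integrand \(p_n^2/\varphi\) is bounded and the divergence is finite; increasing \(n\) should drive `chi2_to_normal(n)` toward zero, in line with the convergence the summary describes.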


60F15 Strong limit theorems
60E05 Probability distributions: general theory
62B10 Statistical aspects of information-theoretic topics


[1] Amosova, N. N. (1990). Narrow zones of local normal attraction. Teor. Veroyatn. Primen.35 138-143. Translation in: Theory Probab. Appl.35 (1990) 140-145 (1991). · Zbl 0701.60018
[2] Amosova, N. N. (1990). A remark on a local limit theorem for large deviations. Teor. Veroyatn. Primen.35 754-756. Translation in: Theory Probab. Appl.35 (1990), 758-760 (1991). · Zbl 0776.60034
[3] Artstein, S., Ball, K. M., Barthe, F. and Naor, A. (2004). On the rate of convergence in the entropic central limit theorem. Probab. Theory Related Fields129 381-390. · Zbl 1055.94004 · doi:10.1007/s00440-003-0329-4
[4] Artstein, S., Ball, K. M., Barthe, F. and Naor, A. (2004). Solution of Shannon’s problem on the monotonicity of entropy. J. Amer. Math. Soc.17 975-982. · Zbl 1062.94006 · doi:10.1090/S0894-0347-04-00459-X
[5] Bally, V. and Caramellino, L. (2016). Asymptotic development for the CLT in total variation distance. Bernoulli22 2442-2485. · Zbl 1346.60016 · doi:10.3150/15-BEJ734
[6] Barron, A. R. (1986). Entropy and the central limit theorem. Ann. Probab.14 336-342. · Zbl 0599.60024 · doi:10.1214/aop/1176992632
[7] Bhattacharya, R. N. and Ranga Rao, R. (1976). Normal Approximation and Asymptotic Expansions. Wiley, New York. · Zbl 0331.41023
[8] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2011). Non-uniform bounds in local limit theorems in case of fractional moments. I. Math. Methods Statist.20 171-191. · Zbl 1239.60016 · doi:10.3103/S106653071103001X
[9] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2013). Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem. Ann. Probab.41 2479-2512. · Zbl 1296.60051 · doi:10.1214/12-AOP780
[10] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2014). Berry-Esseen bounds in the entropic central limit theorem. Probab. Theory Related Fields159 435-478. · Zbl 1307.60011 · doi:10.1007/s00440-013-0510-3
[11] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2014). Fisher information and the central limit theorem. Probab. Theory Related Fields159 1-59. · Zbl 1372.60018 · doi:10.1007/s00440-013-0500-5
[12] Bobkov, S. G., Chistyakov, G. P. and Kösters, H. (2015). The entropic Erdős-Kac limit theorem. J. Theoret. Probab.28 1520-1555. · Zbl 1335.60018 · doi:10.1007/s10959-014-0550-3
[13] Bobkov, S. G. and Götze, F. (1999). Exponential integrability and transportation cost related to logarithmic Sobolev inequalities. J. Funct. Anal.163 1-28. · Zbl 0924.46027 · doi:10.1006/jfan.1998.3326
[14] Bobkov, S. G., Houdré, C. and Tetali, P. (2006). The subgaussian constant and concentration inequalities. Israel J. Math.156 255-283. · Zbl 1134.60016
[15] Borland, L., Plastino, A. R. and Tsallis, C. (1998). Information gain within nonextensive thermostatistics. J. Math. Phys.39 6490-6501. · Zbl 0938.82001 · doi:10.1063/1.532660
[16] Carlen, E. A. (1991). Superadditivity of Fisher’s information and logarithmic Sobolev inequalities. J. Funct. Anal.101 194-211. · Zbl 0732.60020 · doi:10.1016/0022-1236(91)90155-X
[17] Cramér, H. (1925). On some classes of series used in mathematical statistics. In Proc. 6th Scand. Math. Congr. Copenhagen 399-425. Also: Harald Cramér (1994) Collected Works, Vol. I (A. Martin-Löf, ed.) 438-464. Springer, Berlin.
[18] Csiszár, I. (1967). Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar.2 299-318. · Zbl 0157.25802
[19] Dembo, A., Cover, T. M. and Thomas, J. A. (1991). Information-theoretic inequalities. IEEE Trans. Inform. Theory37 1501-1518. · Zbl 0741.94001 · doi:10.1109/18.104312
[20] Fomin, S. V. (1982). The central limit theorem: Convergence in the norm \(\|u\|=\bigl(\int_{-\infty}^{\infty}u^{2}(x)e^{x^{2}/2}\,dx\bigr)^{1/2}\). Zap. Nauchn. Sem. Leningrad. Otdel. Mat. Inst. Steklov. (LOMI) 119 218-229, 242, 245. Problems of the theory of probability distributions, VII. · Zbl 0495.60031
[21] Gibbs, A. L. and Su, F. E. (2002). On choosing and bounding probability metrics. Int. Stat. Rev.70 419-435. · Zbl 1217.62014 · doi:10.1111/j.1751-5823.2002.tb00178.x
[22] Gilardoni, G. L. (2010). On Pinsker’s and Vajda’s type inequalities for Csiszár’s \(f\)-divergences. IEEE Trans. Inform. Theory56 5377-5386. · Zbl 1366.94185 · doi:10.1109/TIT.2010.2068710
[23] Hirschman, I. I. and Widder, D. V. (1955). The Convolution Transform. Princeton Univ. Press, Princeton, NJ. · Zbl 0039.33202
[24] Ibragimov, I. A. and Linnik, Yu. V. (1971). Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff Publishing, Groningen. · Zbl 0219.60027
[25] Johnson, O. (2004). Information Theory and the Central Limit Theorem. Imperial College Press, London. · Zbl 1061.60019
[26] Johnson, O. and Barron, A. (2004). Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields129 391-409. · Zbl 1047.62005 · doi:10.1007/s00440-004-0344-0
[27] Kullback, S. (1967). A lower bound for discrimination in terms of variation. IEEE Trans. Inform. Theory13 126-127.
[28] Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. Ann. Math. Stat.22 79-86. · Zbl 0042.38403 · doi:10.1214/aoms/1177729694
[29] Le Cam, L. (1986). Asymptotic Methods in Statistical Decision Theory. Springer, New York. · Zbl 0605.62002
[30] Lieb, E. H. (1975). Some convexity and subadditivity properties of entropy. Bull. Amer. Math. Soc.81 1-13. · Zbl 0308.94020 · doi:10.1090/S0002-9904-1975-13621-4
[31] Liese, F. and Vajda, I. (1987). Convex Statistical Distances. Teubner-Texte zur Mathematik [Teubner Texts in Mathematics] 95. BSB B. G. Teubner Verlagsgesellschaft, Leipzig. · Zbl 0656.62004
[32] Linnik, Ju. V. (1959). An information-theoretic proof of the central limit theorem with Lindeberg conditions. Theory Probab. Appl.4 288-299. · Zbl 0097.13103 · doi:10.1137/1104028
[33] Madiman, M. and Barron, A. (2007). Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theory53 2317-2329. · Zbl 1326.94034 · doi:10.1109/TIT.2007.899484
[34] Nielsen, F. (2014). On the Chi square and higher-order Chi distances for approximating \(f\)-divergences. IEEE Signal Process. Lett.21 10-13.
[35] Otto, F. and Villani, C. (2000). Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality. J. Funct. Anal.173 361-400. · Zbl 0985.58019 · doi:10.1006/jfan.1999.3557
[36] Petrov, V. V. (1964). Local limit theorems for sums of independent random variables. Teor. Veroyatn. Primen.9 343-352. · Zbl 0146.38003
[37] Petrov, V. V. (1975). Sums of Independent Random Variables. Ergebnisse der Mathematik und ihrer Grenzgebiete82. Springer, New York. · Zbl 0322.60042
[38] Pinsker, M. S. (1964). Information and Information Stability of Random Variables and Processes. Holden-Day, Inc., San Francisco, CA. Translated and edited by Amiel Feinstein. · Zbl 0125.09202
[39] Prokhorov, Yu. V. (1952). A local theorem for densities. Dokl. Akad. Nauk SSSR (N.S.) 83 797-800. · Zbl 0046.35301
[40] Rényi, A. (1961). On measures of entropy and information. In Proc. 4th Berkeley Sympos. Math. Statist. and Prob., Vol. I 547-561. Univ. California Press, Berkeley, CA. · Zbl 0106.33001
[41] Shiryaev, A. N. (1996). Probability, 2nd ed. Graduate Texts in Mathematics95. Springer, New York. Translated from the first (1980) Russian edition by R. P. Boas. · Zbl 0508.60001
[42] Siraždinov, S. H. and Mamatov, M. (1962). On mean convergence for densities. Theory Probab. Appl.7 424-428. · Zbl 0302.60015
[43] Szegő, G. Orthogonal Polynomials, 3rd ed. Amer. Math. Soc. Colloq. Publ.23. Amer. Math. Soc., Providence, RI.
[44] Toscani, G. (2016). Entropy inequalities for stable densities and strengthened central limit theorems. J. Stat. Phys.165 371-389. · Zbl 1356.60039 · doi:10.1007/s10955-016-1619-4
[45] Toscani, G. (2016). The fractional Fisher information and the central limit theorem for stable laws. Ric. Mat.65 71-91. · Zbl 1366.60064 · doi:10.1007/s11587-015-0253-9
[46] Tsallis, C. (1998). Generalized entropy-biased criterion for consistent testing. Phys. Rev. E58 1442-1445.
[47] Tulino, A. M. and Verdú, S. (2006). Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof. IEEE Trans. Inform. Theory52 4295-4297. · Zbl 1320.60111 · doi:10.1109/TIT.2006.880066
[48] Vajda, I. (1989). Theory of Statistical Inference and Information. Kluwer Academic, Dordrecht. · Zbl 0711.62002
[49] van Erven, T. and Harremoës, P. (2014). Rényi divergence and Kullback-Leibler divergence. IEEE Trans. Inform. Theory60 3797-3820. · Zbl 1360.94180 · doi:10.1109/TIT.2014.2320500