zbMATH — the first resource for mathematics

Fisher information inequalities and the central limit theorem. (English) Zbl 1047.62005
Summary: We give conditions for an \(O(1/n)\) rate of convergence of the Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in \(L^2\) spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of A. R. Barron [Ann. Probab. 14, 336–342 (1986; Zbl 0599.60024)] and L. D. Brown [Statistics and Probability. Essays in Honor of C. R. Rao, 141–148 (1982; Zbl 0484.60019)]. We show that if the standardized Fisher information ever becomes finite, then it converges to zero.
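The standardized Fisher information appearing in the summary can be made concrete with the following standard definitions (these are conventional and are assumed here, as the entry itself does not spell them out):

```latex
% For a random variable X with smooth density f, the Fisher information is
%   J(X) = \int \left( \frac{f'(x)}{f(x)} \right)^2 f(x)\, dx .
% The standardized Fisher information (relative to the Gaussian of the same
% variance \sigma^2) is
%   J_{\mathrm{st}}(X) = \sigma^2 J(X) - 1 ,
% which satisfies J_{\mathrm{st}}(X) \ge 0, with equality iff X is Gaussian.
% For i.i.d. standardized summands, writing
%   S_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}} ,
% the rate result described in the summary takes the form
%   J_{\mathrm{st}}(S_n) = O(1/n)
% under the stated conditions, and J_{\mathrm{st}}(S_n) \to 0 whenever it is
% ever finite.
```

By the de Bruijn identity, control of the Fisher information along the CLT also yields control of the relative entropy to the Gaussian, which is why the two quantities appear together in the summary.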

62B10 Statistical aspects of information-theoretic topics
62F05 Asymptotic properties of parametric tests
94A17 Measures of information, entropy
Full Text: DOI arXiv