# zbMATH — the first resource for mathematics

Entropy and the central limit theorem. (English) Zbl 0599.60024
The author establishes a strengthened central limit theorem for densities, showing that the convergence is monotone in the sense of relative entropy. The relative entropy is defined by $D_n = \int f(x)\log\bigl(f(x)/\phi(x)\bigr)\,dx$, where $f$ is the density of a random variable $X$ with finite variance and $\phi$ is the normal density with the same mean and variance as $f$.
Reviewer: L.Pardo
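The quantity $D_n$ can be illustrated numerically. The sketch below (an illustration, not the author's construction; the function name and grid parameters are assumptions) computes the density of a sum of $n$ i.i.d. Uniform(0,1) variables by repeated discrete convolution, then evaluates the relative entropy against the normal density with matching mean and variance. Since relative entropy to the matched Gaussian is invariant under affine rescaling, standardizing the sum is unnecessary.

```python
import numpy as np

def clt_relative_entropy(n, step=1e-3):
    """D_n = \int f log(f/phi) dx for the sum of n iid Uniform(0,1) variables.

    Illustrative sketch: f is obtained by repeated discrete convolution on a
    grid of spacing `step`; the integral is a left-endpoint Riemann sum.
    """
    base = np.ones(int(round(1 / step)))  # uniform density on [0, 1), sampled
    f = base.copy()
    for _ in range(n - 1):
        # Discrete convolution approximates the density of the next partial sum.
        f = np.convolve(f, base) * step
    x = np.arange(len(f)) * step          # grid covering the support [0, n)
    mean, var = n / 2.0, n / 12.0         # moments of the sum of n uniforms
    phi = np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return float(np.sum(f * np.log(f / phi)) * step)
```

For the single uniform, $D_1 = \tfrac12\bigl(1 + \log(\pi/6)\bigr) \approx 0.176$ in closed form, and the computed values decrease toward zero as $n$ grows, consistent with the monotone entropy convergence described above.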

##### MSC:
60F05 Central limit and other weak theorems
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
##### Keywords:
Fisher information; central limit theorem; relative entropy