A lower bound on the relative entropy with respect to a symmetric probability. (English) Zbl 1320.60056
Summary: Let \(\rho\) and \(\mu\) be two probability measures on \(\mathbb R\) which are not the Dirac mass at \(0\). We denote by \(H(\mu|\rho)\) the relative entropy of \(\mu\) with respect to \(\rho\). We prove that, if \(\rho\) is symmetric and \(\mu\) has a finite first moment, then \[ H(\mu|\rho)\geq \frac{\biggl(\int_{\mathbb R}z\,d\mu(z)\biggr)^2}{2\int_{\mathbb R}z^2\,d\mu(z)} \] with equality if and only if \(\mu=\rho\). We give an application to the Curie-Weiss model of self-organized criticality.
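As an illustrative check of the bound (not part of the original summary; the Gaussian computations below are standard facts assumed for the example), take \(\rho=\mathcal N(0,1)\), which is symmetric, and \(\mu=\mathcal N(m,1)\) for some \(m\in\mathbb R\). Then \[ H(\mu|\rho)=\frac{m^2}{2},\qquad \int_{\mathbb R}z\,d\mu(z)=m,\qquad \int_{\mathbb R}z^2\,d\mu(z)=m^2+1, \] so the right-hand side of the inequality equals \(\frac{m^2}{2(m^2+1)}\leq\frac{m^2}{2}=H(\mu|\rho)\), with equality precisely when \(m=0\), i.e. \(\mu=\rho\).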
MSC:
60E15 Inequalities; stochastic orderings
60F10 Large deviations
94A17 Measures of information, entropy