On the Jensen-Shannon divergence and the variation distance for categorical probability distributions. (English) Zbl 1513.62013

Summary: We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys’ divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and the minimum Jensen-Shannon divergence estimate, as well as the asymptotic consistency of the minimum Jensen-Shannon divergence estimate. These are key properties for likelihood-free simulator-based inference.
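The quantities in the summary are straightforward to compute for categorical distributions. The sketch below (an illustration, not the paper's derivation) evaluates the Jensen-Shannon divergence and the total variation distance and checks two Pinsker-type relations: the classical Pinsker inequality KL(P‖Q) ≥ 2·TV(P,Q)², and the crude lower bound JSD(P,Q) ≥ TV(P,Q)²/2 obtained by applying Pinsker to each KL term of the JSD (the paper's bounds in terms of squared variation distance are sharper; their exact constants are not reproduced here).

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence in nats; assumes supp(p) ⊆ supp(q)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL to the mixture M = (P+Q)/2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def tv(p, q):
    """Total variation distance: half the L1 distance between the pmfs."""
    return 0.5 * float(np.sum(np.abs(p - q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

# Classical Pinsker inequality: KL(P||Q) >= 2 * TV(P,Q)^2.
assert kl(p, q) >= 2 * tv(p, q) ** 2

# Since TV(P,M) = TV(Q,M) = TV(P,Q)/2, Pinsker applied to each KL term
# of the JSD yields the (loose) bound JSD(P,Q) >= TV(P,Q)^2 / 2.
assert jsd(p, q) >= 0.5 * tv(p, q) ** 2
```

Note that the JSD is symmetric in its arguments and bounded above by log 2 in nats, which is why bounds of this form can be made two-sided via reverse-Pinsker-type arguments, as the summary indicates.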


62B10 Statistical aspects of information-theoretic topics
62H05 Characterization and structure theory for multivariate probability distributions; copulas
94A17 Measures of information, entropy
Full Text: DOI