Corander, Jukka; Remes, Ulpu; Koski, Timo
On the Jensen-Shannon divergence and the variation distance for categorical probability distributions. (English)
Zbl 1513.62013
Kybernetika 57, No. 6, 879-907 (2021).

Summary: We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and the minimum Jensen-Shannon divergence estimate, as well as the asymptotic consistency of the minimum Jensen-Shannon divergence estimate. These are key properties for likelihood-free simulator-based inference.

MSC:
62B10 Statistical aspects of information-theoretic topics
62H05 Characterization and structure theory for multivariate probability distributions; copulas
94A17 Measures of information, entropy

Keywords: blended divergences; Chan-Darwiche metric; likelihood-free inference; implicit maximum likelihood; reverse Pinsker inequality; simulator-based inference

Software: BOLFI; ELFI

Full Text: DOI
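
For reference, a minimal sketch of the standard quantities named in the summary, assuming the usual conventions for categorical distributions P and Q on a finite alphabet. The paper's specific decomposition and the constants in its upper and lower bounds are not reproduced in the summary; the normalization of the variation distance below (the L1 form) may also differ from the one used in the paper.

```latex
% Standard definitions (sketch); the paper's exact bounds are not stated here.
\[
  \mathrm{KL}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}, \qquad
  J(P,Q) = \mathrm{KL}(P \,\|\, Q) + \mathrm{KL}(Q \,\|\, P) \quad \text{(Jeffreys' divergence)},
\]
\[
  \mathrm{JS}(P,Q) = \tfrac{1}{2}\,\mathrm{KL}\!\left(P \,\middle\|\, \tfrac{P+Q}{2}\right)
                   + \tfrac{1}{2}\,\mathrm{KL}\!\left(Q \,\middle\|\, \tfrac{P+Q}{2}\right), \qquad
  V(P,Q) = \sum_{x} \bigl|P(x) - Q(x)\bigr|.
\]
% Pinsker's inequality in the L1 normalization:
\[
  \mathrm{KL}(P \,\|\, Q) \;\ge\; \tfrac{1}{2}\, V(P,Q)^{2}.
\]
```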