
On metric divergences of probability measures. (English) Zbl 1186.94421

Summary: Standard properties of \(\phi\)-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the application of mathematical methods is the metricity of \(\phi\)-divergences, or the metricity of their powers. This paper extends the previously known family of \(\phi\)-divergences with these properties. The extension consists of a continuum of \(\phi\)-divergences which are squared metric distances; most of them are new, but the family also includes some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the \(\phi\)-divergences from the extended class, including the range of values and the upper and lower bounds attained under a fixed total variation.
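
For background (standard definitions not restated in this entry, added here only for orientation): given a convex function \(\phi \colon (0,\infty) \to \mathbb{R}\) with \(\phi(1) = 0\), the \(\phi\)-divergence of probability measures \(P\) and \(Q\) with densities \(p\), \(q\) relative to a dominating measure \(\mu\) is
\[
D_\phi(P,Q) = \int \phi\!\left(\frac{p}{q}\right) q \,\mathrm{d}\mu .
\]
The Le Cam squared distance mentioned in the summary is the case \(\phi(t) = \frac{(t-1)^2}{2(t+1)}\), that is,
\[
\mathrm{LC}(P,Q) = \frac{1}{2} \int \frac{(p-q)^2}{p+q} \,\mathrm{d}\mu ,
\]
and its square root \(\sqrt{\mathrm{LC}(P,Q)}\) is a metric on the set of probability measures, so \(\mathrm{LC}\) itself is a squared metric distance.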

MSC:

94A17 Measures of information, entropy
68T10 Pattern recognition, speech recognition
62B10 Statistical aspects of information-theoretic topics
62H30 Classification and discrimination; cluster analysis (statistical aspects)

References:

[1] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318. · Zbl 0157.25802
[2] I. Csiszár: On topological properties of \(f\)-divergences. Studia Sci. Math. Hungar. 2 (1967), 329-339. · Zbl 0157.25803
[3] B. Fuglede, F. Topsøe: Jensen-Shannon divergence and Hilbert space embedding. Proc. IEEE Internat. Symposium on Inform. Theory, IEEE Publications, New York 2004, p. 31.
[4] P. Kafka, F. Österreicher, I. Vincze: On powers of \(f\)-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 329-339. · Zbl 0771.94004
[5] M. Khosravifard, D. Fooladivanda, T. A. Gulliver: Confliction of the convexity and metric properties in \(f\)-divergences. IEICE Trans. on Fundamentals E90-A (2007), 1848-1853. · doi:10.1093/ietfec/e90-a.9.1848
[6] V. Kůs, D. Morales, I. Vajda: Extensions of the parametric families of divergences used in statistical inference. Kybernetika 44 (2008), 95-112. · Zbl 1142.62002
[7] L. Le Cam: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986. · Zbl 0605.62002
[8] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987. · Zbl 0656.62004
[9] F. Liese, I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412. · Zbl 1287.94025 · doi:10.1109/TIT.2006.881731
[10] K. Matusita: Decision rules based on the distance for problems of fit, two samples and estimation. Ann. Math. Statist. 26 (1955), 631-640. · Zbl 0065.12101 · doi:10.1214/aoms/1177728422
[11] F. Österreicher: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 389-393. · Zbl 0897.60015
[12] F. Österreicher, I. Vajda: A new class of metric divergences on probability spaces and its statistical applications. Ann. Inst. Statist. Math. 55 (2003), 639-653. · Zbl 1052.62002 · doi:10.1007/BF02517812
[13] I. Vajda: On the \(f\)-divergence and singularity of probability measures. Period. Math. Hungar. 2 (1972), 223-234. · Zbl 0248.62001 · doi:10.1007/BF02018663
[14] I. Vincze: On the concept and measure of information contained in an observation. In: Contributions to Probability (J. Gani and V. K. Rohatgi, eds.), Academic Press, New York 1981, pp. 207-214. · Zbl 0531.62002