Some new flexibilizations of Bregman divergences and their asymptotics. (English) Zbl 1428.62036

Nielsen, Frank (ed.) et al., Geometric science of information. Third international conference, GSI 2017, Paris, France, November 7–9, 2017. Proceedings. Cham: Springer. Lect. Notes Comput. Sci. 10589, 514-522 (2017).
Summary: Ordinary Bregman divergences (distances) OBD are widely used in statistics, machine learning, and information theory. They can be flexibilized in various ways. For instance, there are the scaled Bregman divergences SBD of W. Stummer [Proc. Appl. Math. Mech. 7, No. 1, 1050503–1050504 (2007; doi:10.1002/pamm.200700814)] and W. Stummer and I. Vajda [IEEE Trans. Inf. Theory 58, No. 3, 1277–1288 (2012; Zbl 1365.62019)], which contain both the OBDs as well as the Csiszár-Ali-Silvey \(\phi\)-divergences as special cases. On the other hand, the OBDs are subsumed by the total Bregman divergences of M. Liu et al. [Proceedings 23rd IEEE CVPR, 3463–3468 (2010); IEEE Trans. Pattern Anal. Mach. Intell. 34, No. 12, 2407–2419 (2012)], B. C. Vemuri et al. [IEEE Trans. Med. Imag. 30, No. 2, 475–483 (2011)] and the more general conformal divergences COD of R. Nock et al. [IEEE Trans. Inf. Theory 62, No. 1, 527–538 (2016; Zbl 1359.94347)]. The latter authors also indicated the possibility of combining the concepts of SBD and COD, under the name “conformal scaled Bregman divergences” CSBD.
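For orientation, the two central notions above can be written in a standard form (notation chosen here for illustration: convex generator \(\phi\), densities \(p, q\) with respect to a reference measure \(\lambda\), and a scaling measure \(M\) with density \(m\)):

```latex
% Ordinary Bregman divergence OBD, separable case:
B_\phi(P, Q)
  = \int \Big[ \phi(p) - \phi(q) - \phi'(q)\,(p - q) \Big] \, d\lambda .

% Scaled Bregman divergence SBD of Stummer and Stummer-Vajda,
% obtained by evaluating the generator at the M-scaled densities:
B_\phi(P, Q \mid M)
  = \int \Big[ \phi\!\Big(\tfrac{p}{m}\Big) - \phi\!\Big(\tfrac{q}{m}\Big)
      - \phi'\!\Big(\tfrac{q}{m}\Big)\Big(\tfrac{p}{m} - \tfrac{q}{m}\Big) \Big] \, dM .
```

The two special cases mentioned in the summary then arise by choice of \(M\): taking \(M = \lambda\) recovers the OBD, while taking \(M = Q\) (i.e., \(m = q\)) reduces the integrand to a function of the likelihood ratio \(p/q\) and yields the Csiszár-Ali-Silvey \(\phi\)-divergences.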
In this paper, we introduce some new divergences between (non-)probability distributions which in particular cover the corresponding OBD, SBD, COD and CSBD (for separable situations) as special cases. Non-convex generators are employed, too. Moreover, for the case of i.i.d. sampling we derive the asymptotics of a useful new-divergence-based test statistic.
For the entire collection see [Zbl 1374.94006].

MSC:

62B10 Statistical aspects of information-theoretic topics
94A17 Measures of information, entropy