# zbMATH — the first resource for mathematics

Optimality conditions for maximizers of the information divergence from an exponential family. (English) Zbl 1149.94007
Summary: The information divergence of a probability measure $$P$$ from an exponential family $${\mathcal E}$$ over a finite set is defined as the infimum of the divergences of $$P$$ from $$Q$$, subject to $$Q\in{\mathcal E}$$. All directional derivatives of the divergence from $${\mathcal E}$$ are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for $$P$$ to be a maximizer of the divergence from $${\mathcal E}$$ are presented, including new ones for the case in which $$P$$ is not projectable to $${\mathcal E}$$.
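For concreteness, the quantity summarized above can be written out. This is a sketch in standard notation, not taken verbatim from the paper: $$D(P\,\|\,Q)$$ denotes the Kullback-Leibler divergence, and the sum form assumes $$P$$ is absolutely continuous with respect to $$Q$$ on the finite base set.

```latex
% Divergence of P from the exponential family E, as defined in the summary:
% the infimum of the divergences of P from members Q of E.
D(P \,\|\, \mathcal{E})
  \;=\; \inf_{Q \in \mathcal{E}} D(P \,\|\, Q)
  \;=\; \inf_{Q \in \mathcal{E}} \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.
```

The paper's maximizers are the measures $$P$$ attaining the supremum of $$D(P\,\|\,\mathcal{E})$$ over all probability measures on the finite set; the first-order conditions characterize these critical points via the directional derivatives mentioned above.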

##### MSC:
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
60A10 Probabilistic measure theory
52A20 Convex sets in $$n$$ dimensions (including convex hypersurfaces)
##### References:
[1] Ay N.: An information-geometric approach to a theory of pragmatic structuring. Ann. Probab. 30 (2002), 416-436 · Zbl 1010.62007 · doi:10.1214/aop/1020107773
[2] Ay N.: Locality of global stochastic interaction in directed acyclic networks. Neural Computation 14 (2002), 2959-2980 · Zbl 1079.68582 · doi:10.1162/089976602760805368 · www.ingentaconnect.com
[3] Ay N., Knauf A.: Maximizing multi-information. Kybernetika 45 (2006), 517-538 · Zbl 1249.82011 · www.kybernetika.cz · eudml:33822 · arxiv:math-ph/0702002
[4] Ay N., Wennekers T.: Dynamical properties of strongly interacting Markov chains. Neural Networks 16 (2003), 1483-1497
[5] Barndorff-Nielsen O.: Information and Exponential Families in Statistical Theory. Wiley, New York 1978 · Zbl 0387.62011
[6] Brown L. D.: Fundamentals of Statistical Exponential Families. (Lecture Notes - Monograph Series 9.) Institute of Mathematical Statistics, Hayward, CA 1986 · Zbl 0685.62002
[7] Csiszár I., Matúš F.: Information projections revisited. IEEE Trans. Inform. Theory 49 (2003), 1474-1490 · Zbl 1063.94016 · doi:10.1109/TIT.2003.810633
[8] Csiszár I., Matúš F.: Closures of exponential families. Ann. Probab. 33 (2005), 582-600 · Zbl 1068.60008 · doi:10.1214/009117904000000766 · arxiv:math/0503653
[9] Csiszár I., Matúš F.: Generalized maximum likelihood estimates for exponential families. To appear in Probab. Theory Related Fields (2008) · Zbl 1133.62039 · doi:10.1007/s00440-007-0084-z
[10] Della Pietra S., Della Pietra V., Lafferty J.: Inducing features of random fields. IEEE Trans. Pattern Anal. Mach. Intell. 19 (1997), 380-393
[11] Letac G.: Lectures on Natural Exponential Families and their Variance Functions. (Monografias de Matemática 50.) Instituto de Matemática Pura e Aplicada, Rio de Janeiro 1992 · Zbl 0983.62501
[12] Matúš F.: Maximization of information divergences from binary i.i.d. sequences. Proc. IPMU 2004, Perugia 2004, Vol. 2, pp. 1303-1306
[13] Matúš F., Ay N.: On maximization of the information divergence from an exponential family. Proc. WUPES'03 (J. Vejnarová, ed.), University of Economics, Prague 2003, pp. 199-204
[14] Rockafellar R. T.: Convex Analysis. Princeton University Press, Princeton, N.J. 1970 · Zbl 0193.18401
[15] Wennekers T., Ay N.: Finite state automata resulting from temporal information maximization. Theory in Biosciences 122 (2003), 5-18 · Zbl 1090.68064 · doi:10.1162/0899766054615671
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.