Csiszár, Imre
Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. (English) Zbl 0753.62003
Ann. Stat. 19, No. 4, 2032-2066 (1991).

Summary: An attempt is made to determine the logically consistent rules for selecting a vector from any feasible set defined by linear constraints, when either all \(n\)-vectors, or those with positive components, or the probability vectors are permissible. Some basic postulates are satisfied if and only if the selection rule is to minimize a certain function which, if a "prior guess" is available, is a measure of distance from the prior guess. Two further natural postulates restrict the permissible distances to the author's \(f\)-divergences and Bregman's divergences [L. M. Bregman, USSR Comput. Math. Math. Phys. 7, No. 3, 200-217 (1969); translation from Zh. Vychisl. Mat. Mat. Fiz. 7, 620-631 (1967; Zbl 0186.23807)], respectively. As corollaries, axiomatic characterizations of the methods of least squares and minimum discrimination information are arrived at. Alternatively, the latter are also characterized by a postulate of composition consistency. As a special case, a derivation of the method of maximum entropy from a small set of natural axioms is obtained.

Cited in 3 Reviews. Cited in 129 Documents.

MSC:
62B10 Statistical aspects of information-theoretic topics
62A01 Foundations and philosophical topics in statistics
94A08 Image processing (compression, reconstruction, etc.) in information and communication theory

Keywords: image reconstruction; logically consistent inference; nonlinear projection; nonsymmetric distance; linear constraints; selection rule; prior guess; f-divergences; Bregman's divergences; least squares; minimum discrimination information; composition consistency; method of maximum entropy

Citations: Zbl 0186.23807
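
To make the distances named in the summary concrete, here is a minimal sketch in notation of my own choosing (the vectors \(u,v\), the constraint data \(A,b\), and the symbols \(D_f\), \(B_\varphi\) are illustrative and need not match the paper's own formulation): for a convex \(f\) with \(f(1)=0\) and a differentiable, strictly convex \(\varphi\),
\[
D_f(v,u)=\sum_{i=1}^n u_i\,f\!\Big(\frac{v_i}{u_i}\Big),
\qquad
B_\varphi(v,u)=\sum_{i=1}^n\bigl[\varphi(v_i)-\varphi(u_i)-\varphi'(u_i)\,(v_i-u_i)\bigr],
\]
and the selection rules in question choose, within a feasible set \(\{v:\,Av=b\}\), the vector minimizing such a distance from the prior guess \(u\). Taking \(f(t)=t\log t\) gives \(D_f(v,u)=\sum_i v_i\log(v_i/u_i)\), i.e. minimum discrimination information, whose special case for probability vectors with uniform \(u\) is the method of maximum entropy; taking \(\varphi(t)=t^2\) gives \(B_\varphi(v,u)=\sum_i(v_i-u_i)^2\), i.e. least squares.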