zbMATH — the first resource for mathematics

Differential-geometrical methods in statistics. (English) Zbl 0559.62001
Lecture Notes in Statistics, 28. Berlin etc.: Springer-Verlag. V, 290 p. DM 43.00 (1985).
A parametric statistical model, i.e. a family of probability measures \((P_{\theta})\), where \(\theta\) runs through an open subset of some Euclidean space, may be regarded as a differentiable manifold. It is natural to furnish it with the Riemannian metric induced by the Fisher information tensor. One might hope that this geometry reveals something about the statistical properties of the model; on the other hand, several non-metric distances, such as the Kullback-Leibler divergence, have been applied with profit in various statistical situations. This indicates that the purely Riemannian approach is too narrow for statistical purposes and that additional geometrical concepts are needed.
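For concreteness, the Fisher metric alluded to above is, in standard notation (writing \(p_{\theta}\) for the density of \(P_{\theta}\) and \(\ell(\theta;x)=\log p_{\theta}(x)\) for the log-likelihood; this display is the reviewer's addition, not a formula from the book):

```latex
% Fisher information metric on the statistical manifold (P_theta)
g_{ij}(\theta) \;=\; E_{\theta}\!\bigl[\,\partial_i \ell(\theta;X)\,\partial_j \ell(\theta;X)\,\bigr],
\qquad \partial_i = \frac{\partial}{\partial\theta^i}.
```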
The current state of affairs can be read from this monograph by the author, who has contributed substantially to the theory. The book is divided into two parts. In the first part, fundamental differential-geometric concepts for statistical manifolds are introduced (starting with the tangent space and the like, which makes the book accessible also to those who have lost all their differential-geometric knowledge).
In contrast to ordinary differential geometry, a whole family of affine connections is defined. It contains the Riemannian connection induced by the Fisher information, but other, non-Riemannian connections turn out to be of even greater significance. Thus one has different measures of curvature, different notions of flatness, and the like. Each of the affine connections is coupled to another one by a concept of duality. This duality is used to introduce a family of divergence measures between probability distributions, which includes the Kullback-Leibler distance, the Hellinger distance, Csiszár's divergences, etc. Statistical manifolds thus carry a geometrical structure which apparently has not been considered before.
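The family of connections in question is Amari's one-parameter family of \(\alpha\)-connections, with \(\nabla^{(0)}\) the Riemannian (Levi-Civita) connection of the Fisher metric and \(\nabla^{(\alpha)}\), \(\nabla^{(-\alpha)}\) dual with respect to it. In standard notation (the book's conventions may differ in minor details), the connection coefficients and the associated \(\alpha\)-divergences read:

```latex
% Amari's alpha-connections (l = log p_theta, expectations under P_theta)
\Gamma^{(\alpha)}_{ij,k}(\theta)
  \;=\; E_{\theta}\!\Bigl[\Bigl(\partial_i\partial_j \ell
        + \tfrac{1-\alpha}{2}\,\partial_i \ell\,\partial_j \ell\Bigr)\partial_k \ell\Bigr],
\qquad
\partial_k g_{ij} \;=\; \Gamma^{(\alpha)}_{ki,j} + \Gamma^{(-\alpha)}_{kj,i}
\quad\text{(duality w.r.t. } g\text{)}.

% alpha-divergence family; alpha -> +-1 recovers the Kullback-Leibler
% divergences, alpha = 0 gives (a multiple of) the squared Hellinger distance
D^{(\alpha)}(p\,\|\,q)
  \;=\; \frac{4}{1-\alpha^{2}}
        \Bigl(1 - \int p^{\frac{1-\alpha}{2}}\,q^{\frac{1+\alpha}{2}}\,dx\Bigr),
\qquad \alpha \neq \pm 1.
```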
The second part of the book contains applications to statistical inference, especially higher-order theory. Edgeworth expansions of the distributions of certain sufficient statistics are given explicitly in geometric terms. In estimation as well as in test theory, curvatures related to the different connections are shown to come into play. Other keywords are: interval estimators; first-, second-, and third-order efficiency; ancillarity; conditional inference; nuisance parameters; jackknifing.
Altogether, the book presents a readable introduction to a theory that promises interesting developments in the future.
Reviewer: G.Kersting

62-02 Research exposition (monographs, survey articles) pertaining to statistics
62F05 Asymptotic properties of parametric tests
62F12 Asymptotic properties of parametric estimators
53B21 Methods of local Riemannian geometry
53B15 Other connections
53B05 Linear and affine connections