The author is concerned with results for finite samples, some of which are familiar as asymptotic formulas. (A) The variance of an unbiased estimate of $\theta$ is at least $1/I$, where $I$ is R. A. Fisher's amount of information. (B) If a sufficient statistic for $\theta$ and an unbiased estimate for $\theta$ exist, then among all unbiased estimates there is one with minimum variance, which is a function of the sufficient statistic. (C) If the distribution having the sufficient statistic has the usual analytic form, there is a function of $\theta$ which has an estimate satisfying (B). These results depend on differentiation under the integral sign, and are extended to several variables. In particular, if an unbiased estimate of one parameter has minimum variance, then it is uncorrelated with any statistic whose expectation depends solely on other parameters. The Riemannian metric $ds^2=g_{ij}\,d\theta_i\,d\theta_j$, where $$ g_{ij}=E\left[\left(\frac 1{\phi}\frac{\partial\phi}{\partial\theta_i}\right)\left(\frac 1{\phi}\frac{\partial\phi}{\partial\theta_j}\right)\right], $$ and $\phi$ is the probability density function of the sample, is introduced into the parameter space, generating the distance proposed by Bhattacharyya [same Bull. 35, 99--109 (1943)]. The distance between two normal populations is calculated and a large-sample test for the equality of all parameters of two populations is proposed.
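The metric coefficients $g_{ij}$ above are the entries of the Fisher information matrix, since $\frac1\phi\frac{\partial\phi}{\partial\theta_i}=\frac{\partial\log\phi}{\partial\theta_i}$ is the score. As a minimal numerical sketch (not part of the paper under review), one can check the formula by Monte Carlo for a single observation from a normal family parametrized by $(\mu,\sigma)$, where theory gives $g=\operatorname{diag}(1/\sigma^2,\,2/\sigma^2)$; the function name and parameter values below are illustrative choices.

```python
import math
import random

def fisher_metric_mc(mu, sigma, n=200_000, seed=0):
    """Monte Carlo estimate of g_ij = E[(d log phi/d theta_i)(d log phi/d theta_j)]
    for one N(mu, sigma^2) observation, with theta = (mu, sigma)."""
    rng = random.Random(seed)
    g_mm = g_ms = g_ss = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        # closed-form score components for the normal density:
        s_mu = (x - mu) / sigma**2                     # d log phi / d mu
        s_sigma = ((x - mu)**2 - sigma**2) / sigma**3  # d log phi / d sigma
        g_mm += s_mu * s_mu
        g_ms += s_mu * s_sigma
        g_ss += s_sigma * s_sigma
    return g_mm / n, g_ms / n, g_ss / n

g_mm, g_ms, g_ss = fisher_metric_mc(mu=0.0, sigma=2.0)
# theory for sigma = 2: g_mm = 1/sigma^2 = 0.25, g_ss = 2/sigma^2 = 0.5, g_ms = 0
```

In this same example, result (A) is the statement that any unbiased estimate of $\mu$ from $n$ observations has variance at least $1/(n\,g_{\mu\mu})=\sigma^2/n$, which the sample mean attains.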

Reviewer: J. W. Tukey (MR0015748)