Estimation of variance components and applications.

*(English)*Zbl 0645.62073
North-Holland Series in Statistics and Probability, Vol. 3. Amsterdam etc.: North-Holland. XIII, 370 p.; Dfl. 160.00 (1988).

This monograph is an important contribution to the field of statistics. The material is well organized, and the topics are presented in a logical sequence. To be able to understand it, one must have a sound background in matrix analysis and the mathematical theory of statistics. It is well indexed, and a generous list of references is provided at the end of the monograph.

Since the tools of matrix analysis are heavily used throughout the monograph to obtain explicit expressions for different estimators of variance and covariance components, Chapter 1 is set aside, besides stating routine matrix methods, for a detailed discussion of topics such as g-inverses of matrices, quadratic subspaces, and quasi-inner products.

The next two chapters deal with mixed linear models within a unified framework, and with developing tools for obtaining the asymptotic distributions of quadratic functions of random variables. Identifiability and estimability of \(\beta\) and \(\theta =(\theta_1,\theta_2,\dots,\theta_r)\) in the general variance components model \[ Y=X\beta +\epsilon,\quad \epsilon \sim \bigl(0,\; V(\theta)=\theta_1 V_1+\dots+\theta_r V_r\bigr) \] are discussed in Chapter 4.
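As a small illustrative sketch (not taken from the monograph), the identifiability of \(\theta\) in such a model hinges on the linear independence of the matrices \(V_1,\dots,V_r\). The following NumPy fragment, assuming a hypothetical one-way random-effects layout and ignoring the mean structure, checks this condition by vectorizing the \(V_i\):

```python
import numpy as np

# Hypothetical one-way random-effects layout: 3 groups of 2 observations.
Z = np.kron(np.eye(3), np.ones((2, 1)))   # group-membership design matrix
V1 = Z @ Z.T      # covariance contribution of the random group effect
V2 = np.eye(6)    # covariance contribution of the residual error

# theta = (theta1, theta2) is identifiable (ignoring beta) iff
# V1 and V2 are linearly independent as matrices:
G = np.column_stack([V1.ravel(), V2.ravel()])
rank = np.linalg.matrix_rank(G)
print(rank)  # rank 2 -> the two variance components are separately identifiable
```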

Minimum norm quadratic estimation (MINQE), introduced by the first author [see J. Multivariate Anal. 1, 257-275 (1971; Zbl 0223.62086)], depends upon prior values of the unknown parameters and upon natural estimators based on the structure of the linear model; it is developed in Chapter 5. The expressions derived for the various kinds of estimators involve the inversion of large matrices, and efficient algorithms for computing such inverses form the subject matter of Chapter 8.
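A minimal computational sketch of the standard MINQUE equations (an assumption-laden illustration, not the authors' implementation): with a working covariance \(V_0=\sum_i \alpha_i V_i\) built from prior values \(\alpha_i\), one forms \(R=V_0^{-1}-V_0^{-1}X(X'V_0^{-1}X)^{-}X'V_0^{-1}\) and solves \(S\hat\theta=u\) with \(S_{ij}=\operatorname{tr}(RV_iRV_j)\) and \(u_i=y'RV_iRy\). The function name `minque` and the default prior of all ones are illustrative choices:

```python
import numpy as np

def minque(y, X, V_list, prior=None):
    """Illustrative MINQUE for theta in Cov(y) = sum_i theta_i * V_i,
    given prior weights for the variance components (assumed sketch)."""
    r = len(V_list)
    if prior is None:
        prior = np.ones(r)  # all prior weights equal to one (illustrative default)
    # Working covariance built from the prior values
    V0 = sum(a * V for a, V in zip(prior, V_list))
    V0_inv = np.linalg.inv(V0)
    # R = V0^{-1} - V0^{-1} X (X' V0^{-1} X)^- X' V0^{-1}
    R = V0_inv - V0_inv @ X @ np.linalg.pinv(X.T @ V0_inv @ X) @ X.T @ V0_inv
    # Solve S theta = u with S_ij = tr(R V_i R V_j), u_i = y' R V_i R y
    S = np.array([[np.trace(R @ V_list[i] @ R @ V_list[j]) for j in range(r)]
                  for i in range(r)])
    u = np.array([y @ R @ Vi @ R @ y for Vi in V_list])
    return np.linalg.solve(S, u)
```

In the degenerate case of a single component \(V_1=I\), this reduces to the usual unbiased residual variance estimator, which gives a quick sanity check on the sketch.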

Chapter 6 discusses the pooling of all available information from different sets of independent data for the efficient estimation of a common set of parameters, while in Chapter 7 the authors study the conditions under which MINQE estimators are independent of the prior choices of unknown parameters and natural estimators.

To reduce the arbitrariness involved in the choice of prior values of the parameters, the concept of iterated MINQE (IMINQE) is introduced in Chapter 9, which leads to a rich class of estimators. In the next chapter, asymptotic properties of MINQEs are derived. The last two chapters contain material on minimum variance quadratic estimators and the conditions under which they do not involve higher moments, and on some practical applications to selection problems involving the prediction of a hypothetical variable (the criterion variable) on the basis of measurements already made on the individuals.

This monograph is carefully prepared and the material is presented in a rigorous fashion, although it contains very few examples. Graduate students and researchers in statistics will find this volume on the estimation of variance components very informative and interesting. Problems and complements at the end of each chapter add to the value of the monograph, and most of them are quite challenging.

Reviewer: D.V.Chopra

##### MSC:

| MSC code | Description |
| --- | --- |
| 62J10 | Analysis of variance and covariance (ANOVA) |
| 62-02 | Research exposition (monographs, survey articles) pertaining to statistics |
| 15A09 | Theory of matrix inversion and generalized inverses |