## Multivariate statistics. A vector space approach.
*(English)*
Zbl 0587.62097

Wiley Series in Probability and Mathematical Statistics. Probability and Mathematical Statistics. New York etc.: John Wiley & Sons. XVI, 512 p. (1983).

The purpose of this book is to present a version of multivariate statistical theory in which vector space and invariance methods replace, to a large extent, more traditional multivariate methods. A brief summary of the contents and flavor of the ten chapters follows.

In Chapter 1, the elements of vector space theory are presented. Since my approach to the subject is geometric rather than algebraic, there is an emphasis on inner product spaces where the notions of length, angle, and orthogonal projection make sense. Geometric topics of particular importance in multivariate analysis include singular value decompositions and angles between subspaces.
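As a concrete illustration of the geometric viewpoint described here, the principal angles between two subspaces can be computed from a singular value decomposition; the following NumPy sketch (the matrices `A` and `B` are illustrative data, not taken from the book) recovers the angles between two coordinate planes of R^4:

```python
import numpy as np

# Two subspaces of R^4, given as the column spans of A and B (illustrative data).
A = np.array([[1., 0.], [0., 1.], [0., 0.], [0., 0.]])  # span{e1, e2}
B = np.array([[1., 0.], [0., 0.], [0., 1.], [0., 0.]])  # span{e1, e3}

# Orthonormal bases via thin QR.
Qa, _ = np.linalg.qr(A)
Qb, _ = np.linalg.qr(B)

# The singular values of Qa^T Qb are the cosines of the principal angles.
cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
angles = np.arccos(np.clip(cosines, -1.0, 1.0))
print(angles)  # [0, pi/2]: the planes share e1 and are otherwise orthogonal
```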

Random vectors taking values in inner product spaces are the general topic of Chapter 2. Here, induced distributions, means, covariances, and independence are introduced in the inner product space setting. These results are then used to establish many traditional properties of the multivariate normal distribution in Chapter 3.

In Chapter 4, a theory of linear models is given that applies directly to multivariate problems. This development, suggested by Kruskal’s treatment of univariate linear models, contains results that identify all the linear models to which the Gauss-Markov theorem applies.
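Kruskal's characterization referred to here states, roughly, that ordinary least squares is the Gauss-Markov (best linear unbiased) estimator exactly when the error covariance maps the regression subspace into itself. A small numerical sketch (the design matrix and covariance are illustrative choices, not from the book) verifies that OLS and GLS coincide for one such covariance:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 2))       # illustrative design matrix

# Kruskal's condition: OLS is BLUE iff the covariance maps range(X) into itself.
# Sigma = a*I + b*X X^T satisfies it, since Sigma @ X = a*X + b*X (X^T X).
Sigma = 1.0 * np.eye(10) + 0.5 * X @ X.T
y = rng.standard_normal(10)

ols = np.linalg.solve(X.T @ X, X.T @ y)
Si = np.linalg.inv(Sigma)
gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
print(np.allclose(ols, gls))  # True: the two estimators coincide
```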

Chapter 5 contains some standard matrix factorizations and some elementary Jacobians that are used in later chapters. In Chapter 6, the theory of invariant integrals (measures) is outlined. The many examples here were chosen to illustrate the theory and prepare the reader for the statistical applications to follow.

A host of statistical applications of invariance, ranging from the invariance of likelihood methods to the use of invariance in deriving distributions and establishing independence, are given in Chapter 7. Invariance arguments are used throughout the remainder of the book.

The last three chapters are devoted to a discussion of some traditional and not-so-traditional problems in multivariate analysis. Here, I have stressed the connections between classical likelihood methods, linear model considerations, and invariance arguments.

In Chapter 8, the Wishart distribution is defined via its representation in terms of normal random vectors. This representation, rather than the form of the Wishart density, is used to derive properties of the Wishart distribution.
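The normal-representation definition described here lends itself to a quick numerical check. In the sketch below (dimensions, the covariance `Sigma`, and the number of replications are illustrative choices, not from the book), S = X'X with i.i.d. N(0, Sigma) rows of X has a Wishart(n, Sigma) distribution, and the representation yields properties such as E[S] = n Sigma directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 3                      # degrees of freedom and dimension (illustrative)
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
L = np.linalg.cholesky(Sigma)

def wishart_draw(rng):
    # X has n i.i.d. N(0, Sigma) rows; S = X^T X is Wishart(n, Sigma).
    X = rng.standard_normal((n, p)) @ L.T
    return X.T @ X

# The representation gives E[S] = n * Sigma; check it by Monte Carlo.
reps = 20000
S_bar = sum(wishart_draw(rng) for _ in range(reps)) / reps
print(np.round(S_bar / n, 2))    # approximately Sigma
```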

Chapter 9 begins with a thorough discussion of the multivariate analysis of variance (MANOVA) model. Variations on the MANOVA model, including multivariate linear models with structured covariances, are the main topic of the rest of Chapter 9.

An invariance argument that leads to the relationship between canonical correlations and angles between subspaces is the lead topic in Chapter 10. After a discussion of some distribution theory, the chapter closes with the connection between testing for independence and testing in multivariate regression models.
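The relationship mentioned here can be seen numerically: the sample canonical correlations between two data matrices are the cosines of the principal angles between the column spans of the centered matrices. A minimal NumPy sketch (the simulated data are illustrative, not from the book):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.standard_normal((n, 2))
# Y depends strongly on the first column of X, weakly on the second.
Y = X @ np.array([[1.0, 0.0], [0.0, 0.2]]) + 0.5 * rng.standard_normal((n, 2))

# Center, then take orthonormal bases of the column spans.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Qx, _ = np.linalg.qr(Xc)
Qy, _ = np.linalg.qr(Yc)

# Sample canonical correlations = cosines of the principal angles
# between the column spans of the centered data matrices.
canon_corr = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
print(np.round(canon_corr, 3))
```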

Throughout the book, I have assumed that the reader is familiar with the basic ideas of matrix and vector algebra in coordinate spaces and has some knowledge of measure and integration theory. As for statistical prerequisites, a solid first-year graduate course in mathematical statistics should suffice.

### MSC:

| Code | Classification |
|---|---|
| 62Hxx | Multivariate analysis |
| 62-02 | Research exposition (monographs, survey articles) pertaining to statistics |
| 62A01 | Foundations and philosophical topics in statistics |
| 62H15 | Hypothesis testing in multivariate analysis |
| 28C10 | Set functions and measures on topological groups or semigroups, Haar measures, invariant measures |