Matrix methods have evolved from a tool for expressing statistical problems into an indispensable part of the development, understanding, and use of many complex statistical analyses. This evolution has made matrix methods a vital part of statistical education. Traditionally, these methods are taught piecemeal in courses ranging from regression analysis to stochastic processes, creating a fractured view of the topic. Matrix Algebra for Linear Models
offers readers a unique, unified view of matrix analysis theory (introduced where and when it is needed), methods, and their applications. Written for future statisticians, both theoretical and applied, the book presents the key topics concisely and accurately. Emphasis is placed on understanding and interpreting principal components as an eigenvalue problem, generalized inverses, and the singular value decomposition. The derivation of important results in the Analysis of Variance (ANOVA) is made elegant through properties of quadratic forms, the Kronecker product, and special matrices. Numerous numerical examples and exercises further illustrate the motivation behind the concepts.
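The connection mentioned above between principal components, eigenvalues, and the singular value decomposition can be sketched briefly. The following is a minimal illustration (not taken from the book, with a made-up data matrix): the eigenvalues of the sample covariance matrix equal the squared singular values of the centered data matrix divided by n − 1, which is why both decompositions yield the principal component variances.

```python
# Illustrative sketch: principal component variances computed two ways,
# via the eigendecomposition of the sample covariance matrix and via
# the singular value decomposition of the centered data matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3)) @ np.array([[2.0, 0.0, 0.0],
                                             [0.5, 1.0, 0.0],
                                             [0.0, 0.3, 0.2]])
Xc = X - X.mean(axis=0)          # center each column
n = Xc.shape[0]

# Route 1: eigenvalues of the covariance matrix S = Xc'Xc / (n - 1)
S = Xc.T @ Xc / (n - 1)
eigvals = np.sort(np.linalg.eigvalsh(S))[::-1]

# Route 2: squared singular values of Xc, scaled by 1 / (n - 1)
svals = np.linalg.svd(Xc, compute_uv=False)
from_svd = svals**2 / (n - 1)

print(np.allclose(eigvals, from_svd))  # the two routes agree
```

Each eigenvalue is the variance captured by the corresponding principal component, which is the interpretation the book develops.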
Keywords: Applied Probability & Statistics - Models