This textbook provides a wide-ranging introduction to the use and theory of linear models for analyzing data. The author's emphasis is on providing a unified treatment of linear models, including analysis of variance models and regression models, based on projections, orthogonality, and other vector space ideas. Every chapter comes with numerous exercises and examples, making the book ideal for a graduate-level course. All of the standard topics are covered in depth: estimation (including biased and Bayesian estimation), significance testing, ANOVA, multiple comparisons, regression analysis, and experimental design models. In addition, the book covers topics that are not usually treated at this level but are important in their own right: best linear and best linear unbiased prediction, split plot models, balanced incomplete block designs, testing for lack of fit, testing for independence, models with singular covariance matrices, diagnostics, collinearity, and variable selection. This new edition includes sections on alternatives to least squares estimation and the variance-bias tradeoff, an expanded discussion of variable selection, new material on characterizing the interaction space in an unbalanced two-way ANOVA, Freedman's critique of the sandwich estimator, and much more.