Linear Regression with Matrices

Most of our coverage of linear regression has focused on an algebraic approach to deriving the coefficients and estimating parameters. However, we can accomplish the same feat using matrices to represent the coefficients, variance estimates, and other regression-related quantities. In this lecture we describe how linear regression can be approached with matrices. An example of the calculations "by hand" is provided (two other lectures show the same example in R and SAS). Knowing the matrix approach is worthwhile: it can be more computationally efficient, and it eventually makes more complex models easier to reason about.
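As a minimal sketch of the matrix approach described above, the code below computes the least-squares coefficients via the normal equations, beta-hat = (X'X)^(-1) X'y, for a simple linear regression. The data values are illustrative (not taken from the lecture), and plain Python is used so every matrix step is visible; the lecture's own examples use R and SAS.

```python
# Illustrative data (assumed, not from the lecture).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

# Design matrix X: a column of ones (intercept) beside the predictor.
X = [[1.0, xi] for xi in x]

# X'X is 2x2 in simple linear regression: [[n, sum x], [sum x, sum x^2]].
xtx = [[n, sum(x)],
       [sum(x), sum(xi * xi for xi in x)]]

# X'y is a length-2 vector: [sum y, sum x*y].
xty = [sum(y), sum(xi * yi for xi, yi in zip(x, y))]

# Invert the 2x2 matrix directly: (1/det) * [[d, -b], [-c, a]].
a, b = xtx[0]
c, d = xtx[1]
det = a * d - b * c
inv = [[d / det, -b / det],
       [-c / det, a / det]]

# beta-hat = (X'X)^(-1) X'y gives [intercept, slope].
beta = [inv[0][0] * xty[0] + inv[0][1] * xty[1],
        inv[1][0] * xty[0] + inv[1][1] * xty[1]]
print(beta)  # [intercept, slope]
```

The same two lines of matrix algebra extend unchanged to multiple regression; only the size of X'X grows, which is exactly the efficiency argument made in the lecture.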


Table of Contents:

00:00 - Intro Song
00:19 - Welcome
00:54 - Linear Regression in Matrix Notation
04:29 - The Design Matrix
05:42 - The Hat Matrix
06:52 - Deriving the Beta Coefficients
09:05 - Simple Linear Regression with Matrices
11:51 - Properties of Least Squares Estimators
13:08 - Y-hat, Residuals, and Sums of Squares
15:05 - Variances and Covariances
17:06 - Confidence and Prediction Intervals
18:32 - Variance of beta-hat
19:00 - SLR Example (By Hand)
25:23 - Appendix: Code for Matrices in SAS and R