Deriving the least squares estimators of the slope and intercept (simple linear regression)

I derive the least squares estimators of the slope and intercept in simple linear regression, using summation notation and no matrices. I assume that the viewer has already been introduced to the linear regression model, but I provide a brief review in the first few minutes. I also assume a basic knowledge of differential calculus, including the power rule and the chain rule.

If you are already familiar with the problem, and you are just looking for help with the mathematics of the derivation, the derivation starts at 3:26.

At the end of the video, I illustrate that Σ(X_i − X̄)(Y_i − Ȳ) = Σ X_i(Y_i − Ȳ) = Σ Y_i(X_i − X̄), and that Σ(X_i − X̄)² = Σ X_i(X_i − X̄).
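These identities hold for any data set, so they are easy to check numerically; here is a quick sketch (the sample data are my own, not from the video):

```python
# Numerically verify the summation identities shown at the end of the video.
x = [1.0, 2.0, 4.0, 7.0]
y = [2.0, 3.0, 5.0, 11.0]
x_bar = sum(x) / len(x)
y_bar = sum(y) / len(y)

# Three equivalent forms of the cross-product sum.
lhs = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
mid = sum(xi * (yi - y_bar) for xi, yi in zip(x, y))
rhs = sum(yi * (xi - x_bar) for xi, yi in zip(x, y))

# Two equivalent forms of the sum of squared deviations.
sq_lhs = sum((xi - x_bar) ** 2 for xi in x)
sq_rhs = sum(xi * (xi - x_bar) for xi in x)

print(lhs == mid, lhs == rhs, sq_lhs == sq_rhs)
```

The identities work because Σ(Y_i − Ȳ) = 0 and Σ(X_i − X̄) = 0, so the extra terms that distinguish the forms all vanish.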

There are, of course, a number of ways of expressing the formula for the slope estimator, and I make no attempt to list them all in this video.
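For reference, the estimators the derivation arrives at can be sketched in a few lines of Python (the function name and variables are my own, not from the video):

```python
def least_squares(x, y):
    """Slope and intercept that minimize the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: sum of cross-products over sum of squared x-deviations.
    beta1_hat = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
                 / sum((xi - x_bar) ** 2 for xi in x))
    # Intercept: forces the fitted line through the point (x_bar, y_bar).
    beta0_hat = y_bar - beta1_hat * x_bar
    return beta0_hat, beta1_hat

# Data lying exactly on y = 1 + 2x are recovered exactly.
b0, b1 = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
print(b0, b1)  # 1.0 2.0
```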
Comments

Finally, someone who made it simple to understand! Thank you!

AnDrs

My god, you explained this so easily. It took me hours trying to understand this before watching this video but still couldn’t understand it properly. After watching this video, it's crystal-clear now. ❤️

sajibmannan

My Physical Chemistry teacher spent ~1.5 hrs showing this derivation and I got completely lost. Watching your video, it's so clear now. Thank you for your phenomenal work.

JFstriderXgaming

I have gone through tons of materials on this topic and they either skip the derivation process or go directly into some esoteric matrix arithmetic. This video explains everything I need to know. Thanks.

jktpgsg

I don't usually comment on teaching videos. But this really deserves thanks for how clearly and simply you explained everything. The lecture I had at the university left much to be desired

amberxv

The best part of this video is finally figuring out where that "n" came from in the equation for beta-naught-hat. Thank you so very much for making this available.

valeriereid

unbelievably perfect video, one of the best videos I have watched in the statistics field, so rare to find high-quality in this field idk why

alimortadahoumani

I never thought that I could understand simple linear regression using this approach. Thank you

augustinejunior

you have no idea how you saved my life, I was struggling so hard to find out why Σ x_i(x_i - x̄) = Σ(x_i - x̄)^2 and so on. you are the first one I found who explained that.

yixuanliu

phenomenal video. Thank you for taking the time to explain each step of the derivations such as the sum rule for derivation. Thank you for helping me learn.

nak

thank you for actually explaining it, most videos are just like "hi, if you want to solve this, plug in this awesome formula and that's it, thank you for watching :)"

TheMatthyssen

I can't thank you enough for this brilliant explanation!

aaskyboi

Absolutely beautiful derivation!

Crystal clear!

Thanks very much.

monojitchatterjee

You are awesome! I am not a native speaker and still struggling with the master's program courses in the US, but your instruction is so helpful. I appreciate your great help

DHDH_DH

one video on youtube that actually explains something properly

daniyal

Thank you so much! This explanation is literally perfect, helped me so much!

danverzhao

LEGEND, HAVE TO SAY YOU ARE BETTER THAN A PROFESSOR

fisher

Thank you so much for such a clear explanation! It helps me a lot in preparing for my upcoming final exam.

jingyiwang

thank you so much, this video has cleared up all my confusion because the book I'm reading just says 'by doing some simple calculus'

jackhasfun

Amazing video! Slight bumps where my own knowledge was patchy but you provided enough steps for me to work those gaps out.

Murraythis