Linear Regression - Least Squares Criterion Part 1

Comments

I used to watch your videos when I was doing my math courses (in engineering), but it's been a while since I finished all my math courses two semesters ago. However, when I YouTubed "Linear Regression" for a statistics class, I was so happy to see your name at the top of the list. You were the BEST YouTube tutor I had, and I definitely missed watching your videos.

zhd

Dear Patrick! I've been using your videos for about three years, from my last year of IB up to this point (2nd year electrical engineering). I would just like to thank you tons for all these helpful videos you're sharing. You have a very good pedagogical approach, the best I've ever seen. :))) Thank you!!!

meedan

Finally, someone who speaks in a language I understand. Thank you for not taking anything for granted. You are an amazing teacher! Kudos, bro!

mugume

This helped me so much, because my teacher honestly taught me nothing.

Musicllya

Great video, Patrick. I'm a CPA venturing into big data and machine learning (high-level knowledge), and this really helped. It's a challenge explaining this stuff to execs, and this is great.


I really like how you described Least Squares and Linear Regression at an unhurried pace, because I was able to get all of the notes down from this video :) Thanks @patrickJMT

kaylacumming

You helped me through college and now you're helping me at work! Much love!

CenturyBreakdownX

Thank you Patrick for making this so easy to understand. You're a good teacher.

hiendelong

I sat through a 1.5-hour lecture and didn't understand a thing. Watched this and now I understand at least the concept. Thank you!

paulrobbinx

The textbook makes this 20 times more complicated than it actually is. This is fantastic; thank you.

johnnywayne

You got it right already.

Squaring makes errors smaller than 1 smaller and errors larger than 1 larger. A large error is much worse than a small error, so by squaring all the errors, the larger errors are indeed overemphasised.
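To make that concrete, here is a tiny sketch (the error values are made up for illustration): an error 8 times larger than another contributes 64 times as much once squared.

```python
# Made-up error magnitudes, purely for illustration
errors = [0.5, 1.0, 4.0]

# Squaring each error, as the least-squares criterion does
squared = [e ** 2 for e in errors]  # [0.25, 1.0, 16.0]

# Relative weight of the largest vs. smallest error
abs_ratio = errors[2] / errors[0]    # 8.0  before squaring
sq_ratio = squared[2] / squared[0]   # 64.0 after squaring
print(abs_ratio, sq_ratio)
```

So the squared criterion does not just penalise all errors; it penalises a few large misses far more heavily than many small ones.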

kazaakas

I think the problem with the other stuff I was reading was they were trying to pretend that the concept was much more complicated than it really was. Like, they were trying to fit too much stuff in too early. This was a perfect, casual explanation that gave me a good idea of what you were talking about before moving on. Thank you!

gnram

Dammit I just had my exam with this in it today...!

Good luck to all the future generations, when PatrickJMT has uploaded explanations to all the maths that ever was :P

jenzo

Thank you so much. I learn pretty much everything we do in our numerical methods class from your videos.

CNsongs

Thank you, Patrick, for explaining it in such an easy-to-follow format.

kimberleytaylor

Nice job!!! You made me understand it in simple language. Thanks a lot!

mahasish

Excellent explanation. I was looking at Wikipedia and didn't get it nearly as fast as from watching your graph demonstration. Good job!

chutsu_io

It is because the errors ('D' here) would sum to zero if you didn't square them. All the points under the line have negative errors, and the ones above the line have positive errors. For the least-squares line (with an intercept), these always sum to zero; that is a property of the best-fit line. Therefore, to get a useful measure of total error, you have to square them.

Highlander

I remember taking regression in college a few semesters back. It was challenging, but I definitely learned a lot.

Dataisthetruth

OMG, my whole lecture makes so much more sense now. Thank you for this!

mildaonadoronenkovaite