Tutorial 26- Linear Regression Indepth Maths Intuition- Data Science

Please join my channel as a member to get additional benefits like Data Science materials, live streaming for members, and much more.

Connect with me here:
Comments

Best explanation of the cost function; we learned it as master's students and the course couldn't explain it as well. Simply brilliant.

mohitpatel

Why am I not surprised by such a lucid and amazing explanation of the cost function, gradient descent, global minima, and learning rate? Maybe because watching you make complex things seem easy and normal has become a habit of mine. Thank you, SIR.

soumikdutta

I have seen many teachers explain the same concept, but your explanations are next level. Best teacher.

navjotsingh

I never understood what gradient descent and a cost function are until I watched this video 🙏🙏

nandinibalyapally

For those who are confused: the derivative in the convergence step should be dJ/dm, i.e. the derivative of the cost function J with respect to m.
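In code, that derivative for a simple one-parameter fit y_hat = m * x would look roughly like this (the function name and the use of NumPy are my own illustration, not from the video):

    import numpy as np

    def cost_and_gradient(m, x, y):
        """MSE cost J(m) and its derivative dJ/dm for the simple fit y_hat = m * x."""
        residuals = y - m * x                 # errors of the current fit
        J = np.mean(residuals ** 2)           # cost J(m)
        dJ_dm = -2 * np.mean(x * residuals)   # derivative of J with respect to m
        return J, dJ_dm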

anuragmukherjee

Best video on YouTube for understanding the intuition and the (surface-level) math behind linear regression.
Thank you for such great content.

tarunsingh-yjlz

A small comment at 17:35: I guess it is the derivative of J(m) with respect to m, in other words, the rate of change of J(m) over a minute change in m. That gives us the slope at individual points, especially for non-linear curves where the slope is not constant. At each point (m, J(m)), gradient descent travels in the direction opposite to the slope to find the global minimum, using a small learning rate. Please correct me if I am missing something.
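A rough one-step sketch of that "move opposite to the slope" idea; the function name and the default learning rate are assumptions for illustration only:

    def gradient_step(m, dJ_dm, learning_rate=0.01):
        """One gradient-descent update: move m against the sign of the slope dJ/dm."""
        # positive slope -> decrease m; negative slope -> increase m
        return m - learning_rate * dJ_dm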

Thanks for a wonderful video on this concept, @Krish. Your videos are very helpful for understanding the math intuition behind the concepts; I am a huge beneficiary of your videos. Huge respect!!

pjanjanam

Really awesome video, so much better than many famous online portals that charge huge amounts of money to teach these things.

shubhamkohli

Hi Krish, thanks for the video. Some queries/clarifications required:
1. We do not take the gradient of m with respect to m; that would always be 1. We take the gradient of J with respect to m.
2. If we have already calculated the cost function J at multiple values of m, why do we need gradient descent at all, since we already know the m where J is minimum?
3. So we start with an m, calculate grad(J) at that point, update m with m' = m - grad(J) * learn_rate, and repeat until we reach some convergence criterion (see the sketch below).
Please let me know if my understanding is correct.
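A minimal sketch of the loop described in point 3, for a one-parameter fit y_hat = m * x; the tolerance, iteration cap, and sample data are assumptions for illustration, not values from the video:

    import numpy as np

    def gradient_descent(x, y, m=0.0, learning_rate=0.01, tol=1e-6, max_iters=10_000):
        """Fit the slope of y_hat = m * x by repeatedly stepping against dJ/dm."""
        for _ in range(max_iters):
            dJ_dm = -2 * np.mean(x * (y - m * x))   # gradient of the MSE cost w.r.t. m
            new_m = m - learning_rate * dJ_dm       # update: m' = m - grad(J) * learn_rate
            if abs(new_m - m) < tol:                # convergence criterion
                return new_m
            m = new_m
        return m

    # toy data that lies roughly on y = 3x
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([3.1, 5.9, 9.2, 11.8])
    print(gradient_descent(x, y))   # approaches ~3.0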

akrsrivastava

I knew there would be an Indian who could make all this stuff easy!! Thanks Krish

PritishMishra

How can I not say that you are amazing!! I was struggling to understand the importance of gradient descent and you cleared it up for me in the simplest way possible. Thank you so much, sir :)

RJ-dzie

It is hard to find an easy explanation of gradient descent on YouTube. This video is the exception.

mayureshgawai

This is the best material I have ever come across on this topic!

nanditagautam

This math is the same as in the Coursera machine learning courses.
Thank you, sir, for this great content.

dhainik.suthar

The video was really great, but I would like to point out that the derivative you took for the convergence theorem should be the derivative of the cost function with respect to m, not (dm/dm). Also, a small suggestion: at the end it would have been helpful if you had mentioned what m is, the total number of points or the slope of the best-fit line. Apart from this, the video helped me a lot; I hope you add a text note somewhere in this video to help others.
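As a rough illustration of both points (my own sketch, not from the video): here m is the slope of the candidate line, and the curve being descended is the cost J(m) evaluated at many slope values:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([3.1, 5.9, 9.2, 11.8])

    # m is the slope of the candidate line y_hat = m * x, not the number of points
    slopes = np.linspace(0.0, 6.0, 61)
    costs = [np.mean((y - m * x) ** 2) for m in slopes]   # J(m) for each candidate slope

    best_m = slopes[int(np.argmin(costs))]
    print(f"slope with the lowest cost on this grid: {best_m:.1f}")   # close to 3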

ayurdubey

You just made the whole concept clear with this video; you are a great teacher.

padduchennamsetti

Watched this video 3 times back to back. Now it's embedded in my mind forever. Thanks Krish, great explanation!!

priyanshusharma

Best explanation of Linear Regression 🙏🙏🙏. Simply wow 🔥🔥

animeshkoley

Dear Krish: at 14:42 you mention that the curve is called gradient descent. I believe this is not true; gradient descent is not the name of that curve, it is an optimization algorithm.

jamesrobisnon

So beautifully explained... I did not find this kind of clarity anywhere else. Keep up the good work.

annapurnaparida