Linear Regression Gradient Descent | Machine Learning | Explained Simply

Understand what Linear Regression Gradient Descent is in Machine Learning and how it is used. Gradient Descent is an algorithm we use to minimize the value of the cost function, so as to fit our model to the data set.

Gradient Descent finds a minimum of the cost function by using the derivative of the cost function with respect to the parameters. Without an optimization algorithm like Gradient Descent, we could not train most models in Machine Learning.

Learn how it works in the video!
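The procedure described above can be sketched in plain Python (illustrative names, not the video's own code): fit a line y ≈ theta0 + theta1·x by repeatedly stepping each parameter against its partial derivative of the mean-squared-error cost.

```python
def gradient_descent(x, y, alpha=0.1, iterations=1000):
    """Fit y ~ theta0 + theta1 * x by minimizing the squared-error cost."""
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iterations):
        # Predictions with the current parameters
        h = [theta0 + theta1 * xi for xi in x]
        # Partial derivatives of J = (1 / 2m) * sum((h - y)^2)
        d_theta0 = sum(hi - yi for hi, yi in zip(h, y)) / m
        d_theta1 = sum((hi - yi) * xi for hi, yi, xi in zip(h, y, x)) / m
        # Simultaneous update of both parameters
        theta0 -= alpha * d_theta0
        theta1 -= alpha * d_theta1
    return theta0, theta1

# Recover a known line y = 2x + 1 from noiseless data
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [2 * xi + 1 for xi in x]
t0, t1 = gradient_descent(x, y)
```

With this small learning rate and enough iterations, the parameters converge close to the true intercept 1 and slope 2.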


I am dedicated to helping you learn Machine Learning in a cool way!
Comments
Author

If you found any value in the video, hit the red subscribe button and the like button 👍. I would really love your support! 🤗🤗

CodingLane
Author

Finally! I found something useful. Thanks a lot. Everyone teaches the working of gradient descent in a very crude way, but almost no one teaches the math behind it. Almost everyone simply imports gradient descent from some library, and no one shows pseudocode. I wanted to understand the working behind those functions, how the parameters get adjusted, and what math is used behind the scenes, so that if required we can create our own functions, and this video fulfilled all of these requirements.

aniketkumar
Author

Your explanation is really good. It would be helpful if you could make video playlists on Linear Algebra, Optimization and Calculus.

vinyasshreedhar
Author

The best explanation ever, with the detailed mathematics included

Sansaar_Ek_Vistaar
Author

This is a great explanation of gradient descent! Thank you!

simonwang
Author

Last 5 minutes were epic 😍... Thanks 💙

IbrahimAli-kxkp
Author

Thanks for this. I'm learning data analytics but I come from a profession with little math, so it's challenging.

mrguitaramateure
Author

I usually never comment, but this was so simple and easy to understand. Thank you!

ananthdev
Author

Don't stop! This was more than helpful!

johans
Author

Your way of explaining things was just amazing!! I got everything you wanted to explain. Thanks!

aayushjitendrakumar
Author

Keep going, bro, you are clearing up my concepts. Please make a playlist on Python tutorials.

purplefoxdevs
Author

Nice point-to-point explanation; others only create confusion 😅

demon
Author

Hello, at 11:00 why did you multiply the m by 2? In the previous video there was only m.

JJ-pzdx
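A likely answer to the question above, assuming the video uses the standard squared-error cost with a 1/(2m) factor (a common convention, not confirmed by the source):

```latex
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2,
\qquad
\frac{\partial J}{\partial \theta_j}
  = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
```

The 2 produced by differentiating the square cancels the 1/2, leaving 1/m in the gradient; a cost defined with only 1/m would instead leave a factor of 2/m, which could explain the difference between the two videos.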
Author

Hey, can you please help me solve this question?
Question: You run gradient descent for 15 iterations with α = 0.4 and compute J(θ) after each iteration. You find that the value of J(θ) increases over time. Based on this, describe how you would choose a suitable value for α.

wthejnu
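A sketch of what the question above describes (toy cost chosen for illustration, not from the video): when J(θ) rises after every iteration, the learning rate α is too large and each step overshoots the minimum; the standard fix is to decrease α (e.g. try 0.04, then 0.004) until J(θ) decreases on every iteration.

```python
def run(alpha, steps=15, theta=1.0):
    """Minimize the toy cost J(theta) = 5 * theta^2 (gradient 10 * theta),
    recording J after each update."""
    costs = []
    for _ in range(steps):
        theta -= alpha * 10 * theta  # gradient descent update
        costs.append(5 * theta ** 2)
    return costs

diverging = run(alpha=0.4)    # step overshoots: J grows every iteration
converging = run(alpha=0.01)  # small enough step: J shrinks every iteration
```

With α = 0.4 the parameter is multiplied by (1 − 4) = −3 each step, so the cost explodes; with α = 0.01 it contracts toward zero.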
Author

Can you please solve example questions too, for all the videos you explained...

vl...
Author

Tysm, I really appreciate your explanation

saketh
Author

Thank you a lot for this. Your explanation helped me wrap my head around gradient descent!

Pubuditha
Author

Great explanation. Please make a video on KNN too.

jessicasaini
Author

Why do you ignore the -ve sign in the partial derivative?

jokebro
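A possible clarification for the sign question above, assuming the error term was written as (y − h) at some point in the video: the minus sign from the chain rule is not ignored, it is absorbed by swapping the order of the subtraction.

```latex
\frac{\partial}{\partial \theta_j} \left( y^{(i)} - h_\theta(x^{(i)}) \right)^2
  = 2 \left( y^{(i)} - h_\theta(x^{(i)}) \right) \cdot \left( -x_j^{(i)} \right)
  = 2 \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
```

Either form gives the same update rule once the sign is folded into the error term.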
Author

Hello, I believe the sigma goes from zero to m, not from 1 to m. Anyway, thanks for the great explanation.

hamza