Machine Learning | Gradient Descent (with Mathematical Derivations)

Gradient descent is an optimization algorithm used to minimize a function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient. In machine learning, we use gradient descent to update the parameters of our model. #MachineLearning #GradientDescent #datascience
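The update rule described above can be sketched in a few lines of Python. This is a minimal illustration of gradient descent for simple linear regression (the setting covered in the video); the dataset, learning rate, and step count below are illustrative choices, not taken from the video.

```python
# A minimal sketch of gradient descent for simple linear regression.
# Fits y ≈ m*x + c by repeatedly stepping m and c against the
# gradient of the mean squared error.

def gradient_descent(xs, ys, lr=0.05, steps=5000):
    m, c = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of E = (1/n) * sum((y - (m*x + c))**2)
        dm = (-2.0 / n) * sum(x * (y - (m * x + c)) for x, y in zip(xs, ys))
        dc = (-2.0 / n) * sum(y - (m * x + c) for x, y in zip(xs, ys))
        m -= lr * dm  # move in the direction of the negative gradient
        c -= lr * dc
    return m, c

# Points lying exactly on y = 2x + 1, so the fit should approach m=2, c=1.
m, c = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

With enough steps and a small enough learning rate, `m` and `c` converge to the line that minimizes the error.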

----------------------------------------------------------------------------------------------------------------------------------------
Chapters 🕰️
0:00 Introduction to regression and terminologies
7:40 Relationship of Error with slope & intercept
13:00 Total Derivatives
15:46 Partial Derivatives
17:43 Differentiating LMS
25:10 Resubstitution method (Linear Algebra)
-------------------------------------------------------------------------------------------------------------------------------------------

Comments

What is great about this particular video is that these concepts are explained well in many places, like scattered dots, but you connected the dots to paint the whole picture. An example for gradient descent is included. Very helpful.

sudiptodas

You just helped me understand hundreds of web pages that talked about topics with no order. Thank you

gabelster

You are such a great teacher. Concepts are clearly explained beginning with the basics and slowly easing into the most advanced level. Thank you

donaldngwira

Hi, your video is helpful for beginners to understand the concept. One suggestion: at the very beginning of the video, when you write the equation of your predicted line, remember to mark it as ŷ = m·x(i) + c. It is not y(i), which is the actual data point.

rajapal

My god, you are perfect. I think your work should reach a wider audience; you are better and clearer than the renowned ML YouTubers. Applause, Ranji.

subramaniarumugam

This might be the most underrated explanation on YouTube.

aryandeshpande

The best and clearest explanation of Gradient Descent I've ever listened to. Keep up the good work! 🙌

yashdhawade

When my ML teacher was teaching this, I felt I was learning some rocket science, but when you teach it, it feels very easy. Thank you, Sir 😊

kvv

This is what I pay my internet bill for! Thanks a lot!

vedanthbaliga

Your hard work made the concept very easy to grasp.

sumanmondal

I am a beginner, and as a beginner I was struggling to understand the gradient descent concept. I have seen many videos on gradient descent, but all of them skipped explaining the derivative part, whereas you explained it very well, both total and partial, with solutions. Thanks!

manishjain

I was unable to understand this topic and tried many videos, but this was the most useful one. Thanks!

chinmaysrivastava

It's a very good description. The way you teach is humble and appreciable.

amarnammilton

Everything is so easy on this channel, great work Man!

Sagar_Tachtode_

Man, you've won my heart. You kept it so simple; the best way of explaining Gradient Descent. Can you please help me with using the learning rate in the equation, and the number of steps used in gradient descent, with an example?

vishaldas
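For the question above about the learning rate and the number of steps: here is a hedged one-parameter sketch showing where both typically enter the update rule. The function, learning rate, and step count are illustrative choices, not from the video.

```python
# Illustrative only: gradient descent on f(w) = (w - 3)**2, whose
# minimum is at w = 3. The learning rate scales each step, and the
# step count bounds how many updates are performed.

def minimize(lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)   # f'(w)
        w = w - lr * grad    # learning rate times gradient, subtracted
    return w

w = minimize(lr=0.1, steps=100)  # approaches the minimum at w = 3
```

Too large a learning rate makes the updates overshoot and diverge; too small a rate means many more steps are needed to get close to the minimum.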

Hey, thank you so much for this content. Since I started studying regression using your videos, I've become a huge fan of yours.

swaroopthomare

Hi @ranjiraj, at 21:37 you have given a wrong explanation in the partial derivative w.r.t. C: d/dC (-C) will be -1, so why are you treating it as a constant? In d/dC it is mx that should be taken as constant.

shaun
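For reference, the derivative the comment above discusses can be written out as follows, assuming the standard mean-squared-error form of the loss used for simple linear regression:

```latex
E(m, c) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + c)\bigr)^2

\frac{\partial E}{\partial c}
  = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - m x_i - c\bigr)\cdot(-1)
  = -\frac{2}{n}\sum_{i=1}^{n}\bigl(y_i - m x_i - c\bigr)
```

When differentiating with respect to c, the term m·x_i is held constant (it does not depend on c), while the inner derivative of (-c) with respect to c contributes the factor of -1, which produces the leading minus sign.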

Well explained, bro ❤. Just bring another video for statistics and linear algebra 🎉

trendhindifacts

Very good explanation. It would have been good if you could have explained the usage of the learning rate to find the minimum point.

jayaprakashs

Sir, thank you very much. This has been so helpful, since my course will only get tougher from here onwards, and you helped me understand the basics.

theysigankumar