Gradient descent simple explanation | gradient descent machine learning | gradient descent algorithm

#gradientdescent #unfolddatascience
Hello All,
My name is Aman and I am a data scientist. In this video I explain gradient descent piece by piece, with the intention of making it extremely simple to understand. Gradient descent is a very important algorithm for machine learning and deep learning, so it is a must-know topic for every data scientist. The following questions are answered in this video:
1. What is gradient descent?
2. How does gradient descent work?
3. What is the gradient descent algorithm?
4. What is gradient descent in machine learning?
5. What is gradient descent in deep learning?
6. How does the gradient descent algorithm work?

About Unfold Data Science: This channel helps people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos uploaded are not very technical in nature and hence can be easily grasped by viewers from different backgrounds as well.

Join the Facebook group:

Follow on Twitter: @unfoldds

Follow on Instagram: unfolddatascience

Watch the Python for Data Science playlist here:

Watch the Statistics and Mathematics playlist here:

Watch an end-to-end implementation of a simple machine learning model in Python here:

Learn about ensemble models, bagging, and boosting here:

Access all my code here:

Comments

My question is: when we calculate the partial derivative with respect to 'c' and 'm', we should consider one of them as a constant. For example, to calculate the partial derivative of the cost function J with respect to c, ∂J/∂c, we should consider 'm' as a constant. So the above calculation should be like this: -2[2 - (c+m)] + (-2)[4 - (c+3m)] => -2[2 - c] + (-2)[4 - c] => -2[2] - 2[4] => -4 - 8 => -12.

Please confirm

Agrima_Art_World
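
For readers following the partial-derivative question above, here is a quick symbolic check. It assumes the two data points (x=1, y=2) and (x=3, y=4) and the line ŷ = c + m·x that the expressions in that comment seem to imply; these values are inferred, not quoted from the video. Holding 'm' constant while differentiating with respect to 'c' only means dm/dc = 0; the m terms themselves stay in the result.

```python
# A minimal sympy sketch of the partial derivatives discussed above.
# The data points (1, 2) and (3, 4) and the model y_hat = c + m*x are assumptions
# inferred from the comment, not values quoted from the video.
import sympy as sp

c, m = sp.symbols('c m')
points = [(1, 2), (3, 4)]
J = sum((y - (c + m * x))**2 for x, y in points)  # squared-error cost

dJ_dc = sp.expand(sp.diff(J, c))  # -2*(2-(c+m)) - 2*(4-(c+3*m)) -> 4*c + 8*m - 12
dJ_dm = sp.expand(sp.diff(J, m))  # -2*(2-(c+m)) - 6*(4-(c+3*m)) -> 8*c + 20*m - 28
print(dJ_dc, dJ_dm)
```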

Went through lots of articles but didn't understand the core, but your video made it clear within 15 minutes :) Just awesome, keep up the good work :)

aparnasingh

I like how simply you talk and explain. Especially, your pace gives everyone enough time to understand such a tough concept as gradient descent.

thaitran

You're just amazing! Anyone can understand gradient descent by watching this video. Thanks!

paonsgraphics

The info you have given at around 7:00 was very insightful. It shows why the gradient always points in the direction of steepest ascent. Thank you.

melihulugyldz
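
For anyone who wants the reasoning behind the point raised in the comment above, this is the standard textbook argument (not a transcript of the video): the rate of change of f along a unit direction u is the directional derivative, and it is largest when u is aligned with the gradient.

```latex
% Directional derivative of f at x along a unit vector u:
D_{\mathbf{u}} f(\mathbf{x}) = \nabla f(\mathbf{x}) \cdot \mathbf{u}
  = \lVert \nabla f(\mathbf{x}) \rVert \cos\theta ,
\qquad \lVert \mathbf{u} \rVert = 1 .
% Maximal when \theta = 0, i.e. u parallel to \nabla f (steepest ascent);
% most negative when \theta = \pi, i.e. u antiparallel (steepest descent),
% which is why gradient descent steps along -\nabla f.
```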

This is one of the best explanation videos about gradient descent; I like your detailed explanation. Looking forward to more videos on various optimizers.
Thank you

balug

Hi Aman sir, I am a PhD scholar (almost completed) from IIT Madras, and for the last few months, just out of interest, I have been exploring DS, ML and DL (though I am not from a CS background) and landed on some of your videos on UDS, which really increased my curiosity to learn more about it. Though I have explored a lot of online videos and many other sources on ML, including Coursera etc., I can say that your explanations are extremely good for understanding the subject conceptually. I cannot stop myself from appreciating your great effort; you are doing a really great job and helping the aspirants of ML/DS. Thanks a lot for all that you are doing.

brijkishortiwari

This is the first time I'm learning about gradient descent, and I understood how the algorithm works. This video is amazing. Thank you so much.

pankajgoikar

THE best explanation so far: short, concise, and precise. The algorithm minimizes the loss function by finding the optimal parameters.

cyzfozi

How did I understand everything so easily... Thank you so much sir ❤️❤️❤️❤️❤️.

Ts_kashyap

Great explanation. Thank you for this.

zakikhurshid

The greatest lecture on gradient descent I have ever seen.... Thank you so much for sharing your knowledge, sir ❤️

ManiKandan-lxzg

Nice explanation!
Liked your simplicity...

aneeshwath

Thank you so much, Aman. I don't know why I didn't come across your videos till now. Keep up the good work. :)

akashmanojchoudhary

Simply wow. After a month, I finally understood gradient descent today. Thank you so much for the video 😊

himanshirawat

Awesome explanation Sir!! Great work!!

sukhleenkaur

Excellent!!! I liked the video very much!!! You taught and asked exactly the questions I was looking for! I was looking for exactly such an explanation. Keep going, sir.

evanshareef

This is the best explanation of gradient descent that I have seen. I mean, I already had a good idea of what it was, but you took me to a different level. Thank you!

chriseyebagha

Your teaching method is masterful! None of the books I have read go into such depth. Thanks!!

elcheapo

New value = old value - learning rate * slope

This made me understand the whole concept within seconds.

Thank you Sir!

aaronlopes
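
A minimal Python sketch of the update rule quoted in the comment above (new value = old value - learning rate * slope). The cost J(w) = (w - 3)^2 and the starting point are illustrative assumptions, not the exact example worked through in the video.

```python
# Gradient descent using the rule: new_value = old_value - learning_rate * slope.
# The cost J(w) = (w - 3)**2 and its slope dJ/dw = 2*(w - 3) are illustrative
# assumptions, not the exact numbers used in the video.
def gradient_descent(start=0.0, learning_rate=0.1, steps=100):
    w = start
    for _ in range(steps):
        slope = 2 * (w - 3)            # derivative of (w - 3)**2
        w = w - learning_rate * slope  # the update rule from the comment
    return w

print(gradient_descent())  # approaches the minimum at w = 3
```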