Machine Learning Tutorial Python - 4: Gradient Descent and Cost Function

In this tutorial, we cover a few important machine learning concepts: cost function, gradient descent, learning rate, and mean squared error. We will use a home price prediction use case to understand gradient descent. After going over the math behind these concepts, we will write Python code to implement gradient descent for linear regression. At the end I have an exercise for you to practice gradient descent.

#MachineLearning #PythonMachineLearning #MachineLearningTutorial #Python #PythonTutorial #PythonTraining #MachineLearningCourse #CostFunction #GradientDescent

Topics that are covered in this Video:
0:00 - Overview
1:23 - What is a prediction function and how do we calculate it?
4:00 - Mean squared error (ending time)
4:57 - Gradient descent algorithm and how it works
11:00 - What is a derivative?
12:30 - What is a partial derivative?
16:07 - Python code to implement gradient descent
27:05 - Exercise: come up with a linear function for given test results using gradient descent

Topic Highlights:
1) Theory (We will talk about MSE, cost function, and global minima)
2) Coding (Plain Python code that finds a linear equation for given sample data points using gradient descent)
3) Exercise (Come up with a linear function for given test results using gradient descent)
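The plain-Python gradient descent described above can be sketched as follows. This is a minimal illustration, not the video's exact code: the function name, sample data, learning rate, and iteration count are all assumptions for the sake of the example.

```python
# Gradient descent for simple linear regression y ≈ m*x + b,
# minimizing the mean squared error (MSE) cost function.
def gradient_descent(x, y, learning_rate=0.01, iterations=1000):
    n = len(x)
    m, b = 0.0, 0.0  # start from an arbitrary initial guess
    for _ in range(iterations):
        y_pred = [m * xi + b for xi in x]
        # Partial derivatives of MSE = (1/n) * sum((y_i - (m*x_i + b))**2)
        dm = -(2 / n) * sum(xi * (yi - ypi) for xi, yi, ypi in zip(x, y, y_pred))
        db = -(2 / n) * sum(yi - ypi for yi, ypi in zip(y, y_pred))
        # Step in the direction opposite to the gradient
        m -= learning_rate * dm
        b -= learning_rate * db
    return m, b

# Made-up sample points lying exactly on y = 2x + 3
x = [1, 2, 3, 4, 5]
y = [5, 7, 9, 11, 13]
m, b = gradient_descent(x, y, learning_rate=0.02, iterations=10000)
print(round(m, 2), round(b, 2))  # 2.0 3.0
```

If the learning rate is too large the updates diverge, and if it is too small convergence is slow; the values above were chosen to converge comfortably on this tiny dataset.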

Comments

I've been struggling with my online lectures on machine learning. Your videos are so helpful. I can't thank you enough!

IVIRnathanreilly

TBH man, I went through 3–4 different ways to learn these concepts and got nothing. I genuinely felt like giving up. But YOU… you’re something else. Just one go — clean, clear, and it all finally made sense. You’re a god at explaining. Massive respect 🙌🔥

uselesschannel

3Blue1Brown is a great channel, and so is your explanation. Kudos to you!
Also, it is quite admirable how you positively promote and credit others' good work. That kind of genuineness is much needed.

angulimaldaku

For people who want to know what's going on behind the scenes:
The reason the partial derivative of the cost function (MSE) with respect to m is -(2/n) Σ x_i (y_i - (m x_i + b)) is the chain rule from calculus.
When we take the derivative with respect to m, the power rule turns m^1 into m^(1-1) = 1, leaving only x_i as a factor; the chain rule lets us dissect the function.
For example, suppose we have a function F(m) = (am + b)^2. We deal with the outer square first: dF/dm = 2(am + b) · d/dm(am + b) = 2(am + b) · a. Apply the same chain rule to the MSE above and you get -(2/n) Σ x_i (y_i - (m x_i + b)).
Please don't just accept it as it is; otherwise you never completely learn why things work and can't come up with your own solutions. The easy way never gets you where you want to be.
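The commenter's chain-rule result can be sanity-checked numerically: the analytic partial derivative should agree with a finite-difference estimate of the MSE's slope. A small sketch (the sample data and test point m, b are made up for illustration):

```python
# Check the chain-rule derivative of MSE with respect to m against
# a central finite-difference approximation.
def mse(m, b, x, y):
    n = len(x)
    return sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y)) / n

def dmse_dm(m, b, x, y):
    # Analytic result from the chain rule: -(2/n) * sum(x_i * (y_i - (m*x_i + b)))
    n = len(x)
    return -(2 / n) * sum(xi * (yi - (m * xi + b)) for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0]
y = [2.0, 4.1, 5.9]
m, b, h = 1.5, 0.3, 1e-6
numeric = (mse(m + h, b, x, y) - mse(m - h, b, x, y)) / (2 * h)
print(abs(numeric - dmse_dm(m, b, x, y)) < 1e-6)  # True
```

Because the MSE is quadratic in m, the central difference here is exact up to floating-point error, so the two values match very closely.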

mdlwlmdddwd

Finally, found the best ML tutorials. Coding with mathematics combined and explained very clearly. Thank you!

waytosoakshya

I have gone through so many materials and couldn't understand a thing, but this video is amazing. Thanks for putting up all your videos.

mamtachaudhary

I followed tons of tutorials on gradient descent. Nothing came close to the simplicity of your explanation. Now I have a good grasp of this concept! Thanks for this, sir!

SudiKrishnakum

Machine learning tutorials with exercises:

codebasics

I’m so excited to see you uploaded a new video on machine learning. I’ve watched your other 3 a couple of times. They’re really top notch. Thank you. Please keep this series going. You’re a great teacher too.

officesuperhero

This is the best tutorial i have ever seen. This is truly from scratch. Thank you so much

alidi

You explained in the simplest way this complex concept. Best teacher in the world 🎉🎉

VijaykumarS

Who are the people disliking these videos? These people work hard and make these videos for us. If you don't like a video, don't watch it, but don't dislike it; that misleads the people who come to watch. I know many of us have studied some of these concepts before, but he is making videos for everyone, not for a small section of people. This channel's videos are amazing and don't deserve any dislikes.

AYUSHKUMAR-dmxg

You are the best teacher for data science... thanks

rishabkumar

Thank you so much for the detailed explanation! I have difficulties understanding these theories, and most channels just explain without mentioning the basics. With your explanation, it is now soooo clear! Amazing!!

vanlindertpoffertje

It has become so clear that I am gonna teach it to my dog.

mukulbarai

I have no words to thank you; it's so clearly explained. I'm learning ML by combining Andrew Ng's explanations and yours side by side. A bundle of thanks again from Karachi, Pakistan.

MuhammadAsif-rpx

If there were an award for best teacher in the world, it would go to this person, Programming Hero, and Brackeys.

ecgisamal

Omg!!! This is my first time seeing someone actually calculate how gradient descent works!!!!

daychow

This was an excellent explanation! Not too technical, and explained in simple terms without losing the key elements. I used this to supplement Andrew Ng's machine learning course on Coursera (which got technical real quick) and it's been really helpful. Thanks!

jenglong

It's the most helpful video I have seen so far on gradient descent. Great work. Looking forward to more videos on machine learning.

ayushlabh