How to implement Linear Regression from scratch with Python

In the second lesson of the Machine Learning from Scratch course, we will learn how to implement the Linear Regression algorithm.

Welcome to the Machine Learning from Scratch course by AssemblyAI.
Thanks to libraries like Scikit-learn, we can use most ML algorithms with a couple of lines of code. But knowing how these algorithms work under the hood is very important, and implementing them hands-on is a great way to achieve this.

And most of them are easier to implement than you’d think.

In this course, we will learn how to implement these 10 algorithms.
We will quickly go through how each algorithm works and then implement it in Python with the help of NumPy.
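As a preview of what this lesson builds, here is a minimal sketch of linear regression trained with gradient descent on the mean squared error, using only NumPy. The class and parameter names are illustrative and may differ from the video's exact code:

```python
import numpy as np

class LinearRegression:
    """Linear regression fit by batch gradient descent on the MSE."""

    def __init__(self, lr=0.01, n_iters=1000):
        self.lr = lr            # learning rate (step size)
        self.n_iters = n_iters  # number of gradient-descent iterations
        self.weights = None
        self.bias = None

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features)
        self.bias = 0.0
        for _ in range(self.n_iters):
            y_pred = X @ self.weights + self.bias
            # Gradients of MSE w.r.t. weights and bias.
            # The factor 2 comes from differentiating the squared error;
            # some implementations fold it into the learning rate instead.
            dw = (2 / n_samples) * (X.T @ (y_pred - y))
            db = (2 / n_samples) * np.sum(y_pred - y)
            self.weights -= self.lr * dw
            self.bias -= self.lr * db

    def predict(self, X):
        return X @ self.weights + self.bias
```

On a small noiseless dataset such as y = 2x + 1, a few hundred iterations are enough for the weights to approach the true slope and intercept.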


#MachineLearning #DeepLearning
Comments

This channel is amazing ❤. This is the type of content a lot of instructors forget to teach when you’re learning ML, but she explains everything very well from scratch. Congratulations on your content, and I hope to watch more of your videos. You deserve more views for this incredible job.

Artificial_Intelligence_AI

I have used Linear Regression many times, but never implemented it from scratch. Thanks for an awesome video. Waiting for the next one.

afizs

I rarely bother commenting because I'm rarely impressed. This video was amazing. I love that you're showing theory and OOP. Usually I see basic definitions and code all in one script.

tienshinhan

Wow, that really was from scratch. And the hardest way possible. But it's perfect for teaching python. Thanks!

bendirval

Thank you for this video. Before I watched it, I spent a couple of days trying to understand how to write custom code without frameworks :)

MU_

This is amazing. Thank you so much for all your clear explanations. You really know your stuff, and you make learning this complex material fun and exciting.

markkirby

I was looking for lectures like these. They are awesome.

haseebmuhammad

I think there is a problem: the multiplication by 2 is missing when calculating dw and db in the .fit() method:
dw = (1/n_samples) * np.dot(X.T, (y_pred-y)) * 2
db = (1/n_samples) * np.sum(y_pred-y) * 2
If we follow the slides it's not strictly wrong, but it effectively changes the learning rate.

lamluuuc
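The comment above is right that omitting the factor 2 only rescales the step size. A quick numeric check of that claim (the data, seed, and helper function here are illustrative, not from the video):

```python
import numpy as np

# Dropping the factor 2 from the MSE gradient is the same as halving the
# learning rate: lr * (2 * g) == (2 * lr) * g, so the two update rules
# trace identical trajectories.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5])  # noiseless targets, known weights

def descend(lr, grad_scale, n_iters=100):
    """Gradient descent on MSE; grad_scale is the constant in front."""
    w = np.zeros(3)
    for _ in range(n_iters):
        g = (grad_scale / len(X)) * (X.T @ (X @ w - y))
        w -= lr * g
    return w

w_with_2 = descend(lr=0.05, grad_scale=2)   # gradient keeps the 2
w_without = descend(lr=0.10, grad_scale=1)  # the 2 folded into lr
```

The two runs produce identical weights, so the omission only matters if you are comparing learning rates across implementations.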

Thank you so much, this video really helped me get started with understanding machine learning algorithms. I would love if you could do a video on how you would modify the algorithm for multivariate linear regression.

ayo

Thank u so much for this video. 💖💖

It makes us feel more confident when we know how to do it from scratch rather than just using libraries ✨

ldybu

Another amazing video! Slight typo in the definition of matrix multiplication and the dw part, as well as an omission of the constant 2 (which does not affect the calculations much) in the code where you define the gradients, but other than that this is beautiful 😃

GeorgeZoto

Thanks for making it easy to follow along!

kamranzamanni

Great explanations indeed. The transition from theory to implementation in Python was awesome!
Say, is this a good starting point for a beginner in Data Science, or should I stick to the out-of-the-box sklearn methods for now?

KelvinMafurendi

Amazing video! I learned so much from it! Congrats!!!
Could you explain all of this in more detail and show next steps, like where and how this can be applied further in some scenarios?

marcelo
Автор

Amazing video, I liked the code and the explanations, it was easy to read and understand, thanks! 😁👍👏💯

luisxd

Followed the tutorial exactly, but my results are still different. Using the trial version. Thank you!

sajinsanthosh

I am not sure if you copied this code from Patrick Loeber. He has a youtube video with the same code posted years ago. If you did, please give credit to Patrick.
This is the name of his video: Linear Regression in Python - Machine Learning From Scratch 02 - Python Tutorial

AndroidoAnd

The gradient calculation is missing the multiplication by the coefficient 2, I think.

m.bouanane

Do you not need to np.sum() the result of np.dot(X.T, (y_pred-y)) in dw, as well as multiply by 2?

harry-ceub
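To answer the question above: np.dot with a matrix and a vector already performs the sum over samples, so an extra np.sum would collapse the per-feature gradients into a single number. A quick check with made-up residuals (the data here is illustrative):

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])        # 3 samples, 2 features
r = np.array([0.5, -1.0, 2.0])   # residuals y_pred - y, one per sample

# np.dot(X.T, r) contracts over the sample axis: the result has
# shape (2,), one gradient component per feature.
dw = np.dot(X.T, r)

# Equivalent explicit sum over samples, feature by feature:
manual = np.array([sum(X[i, j] * r[i] for i in range(3))
                   for j in range(2)])
# dw == manual, so no additional np.sum is needed
```

The missing factor 2 is a separate issue; as discussed in the earlier comments, it only rescales the effective learning rate.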

Thank you so much.
It really helped me understand the entire concept:)

kreativeworld