Logistic Regression in Python - Machine Learning From Scratch 03 - Python Tutorial

Get my Free NumPy Handbook:

In this Machine Learning from Scratch Tutorial, we are going to implement the Logistic Regression algorithm, using only built-in Python modules and numpy. We will also learn about the concept and the math behind this popular ML algorithm.
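The implementation described above can be sketched roughly as follows. This is a minimal sketch using only NumPy, not the video's exact code; the class interface and hyperparameter names are assumptions:

```python
import numpy as np

def sigmoid(z):
    # squashes any real number into the interval (0, 1)
    return 1 / (1 + np.exp(-z))

class LogisticRegression:
    def __init__(self, lr=0.01, n_iters=1000):
        self.lr = lr            # learning rate for gradient descent
        self.n_iters = n_iters  # number of gradient descent steps
        self.weights = None
        self.bias = None

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features)
        self.bias = 0.0
        for _ in range(self.n_iters):
            # linear model followed by the sigmoid
            y_pred = sigmoid(X @ self.weights + self.bias)
            # gradients of the log loss (same form as in linear regression)
            dw = (1 / n_samples) * (X.T @ (y_pred - y))
            db = (1 / n_samples) * np.sum(y_pred - y)
            self.weights -= self.lr * dw
            self.bias -= self.lr * db

    def predict(self, X):
        # threshold the predicted probabilities at 0.5
        return (sigmoid(X @ self.weights + self.bias) >= 0.5).astype(int)
```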

~~~~~~~~~~~~~~ GREAT PLUGINS FOR YOUR CODE EDITOR ~~~~~~~~~~~~~~

📓 Notebooks available on Patreon:

If you enjoyed this video, please subscribe to the channel!

The code can be found here:

Further readings:

You can find me here:

#Python #MachineLearning

----------------------------------------------------------------------------------------------------------
* This is a sponsored link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏
Comments

This is the clearest explanation I have seen!! Thank you so much!! :)

ireneashamoses

Thank you, buddy! This makes a lot more sense after my self-study of machine learning using the inbuilt sklearn models.

sushilkokil

For those wondering: the gradient descent step used here is the same as in linear regression, because the derivative of the log loss with respect to the weights works out to the same expression, (1/n) * X.T @ (y_pred - y), including the 1/n factor.

armaanzshaikh
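The commenter's point can be checked numerically: the analytic gradient of the log loss, (1/n) * X.T @ (y_pred - y), should match a finite-difference estimate of the loss. A small sketch (function and variable names are mine, chosen for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def log_loss(w, X, y):
    # binary cross-entropy (log loss)
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def analytic_grad(w, X, y):
    # derivative of the log loss: same form as linear regression's gradient
    return X.T @ (sigmoid(X @ w) - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (rng.random(20) > 0.5).astype(float)
w = rng.normal(size=3)

# central finite-difference check of the first gradient component
eps = 1e-6
e0 = np.zeros(3)
e0[0] = eps
numeric = (log_loss(w + e0, X, y) - log_loss(w - e0, X, y)) / (2 * eps)
print(abs(numeric - analytic_grad(w, X, y)[0]) < 1e-6)  # True
```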

Thanks for the video; now everything that is going on behind the scenes makes sense.

TanmayShrivastava

Excellent video. I am really starting to get a good understanding of these ML algorithms after watching your videos.

nadabu.

There is a small typo in the sigmoid function (1:00)
As-is: h_hat = 1 / (1 + e^(-wx+b))
To-be: h_hat = 1 / (1 + e^(-(wx+b)))
I always appreciate these great videos~

kstyle
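For reference, the corrected sigmoid, with the plus sign in the denominator and the parentheses around wx + b, looks like this in code (function name is mine):

```python
import numpy as np

def sigmoid_of_linear(x, w, b):
    # h_hat = 1 / (1 + e^(-(w*x + b)))
    z = w * x + b
    return 1 / (1 + np.exp(-z))

# at z = 0 the sigmoid is exactly 0.5
print(sigmoid_of_linear(0.0, 1.0, 0.0))  # 0.5
```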

The way you relate Linear Regression to Logistic Regression makes it so clear, thank you so much!

kougamishinya

THANK YOU TO THE MOON AND BACK... BEST EXPLANATION I HAVE EVER SEEN

priyanj

Although a logarithmic loss function was shared in the material, the gradient descent implementation is done using the gradient of the squared/entropy loss function.

RaunakAgarwallawragAkanuaR

I have 2 questions:
1. Why are we transposing X? (I checked the numpy documentation; it is used to change the dimensions, but I can't see the point here.)
2. How are we getting the summation without applying np.sum?
Can you please answer?

satyakikc
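Both questions have the same answer: the matrix product X.T @ (y_pred - y) multiplies and sums over samples in one step, so the transpose lines up the dimensions for the multiplication and np.sum happens implicitly. A quick check (the variable names and toy values are mine):

```python
import numpy as np

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3 samples, 2 features
diff = np.array([0.1, -0.2, 0.3])                    # y_pred - y, one per sample

# matrix form: X.T has shape (2, 3), diff has shape (3,) -> result shape (2,)
dw_matrix = X.T @ diff

# explicit form: for each feature, sum the products over all samples
dw_explicit = np.array([np.sum(X[:, j] * diff) for j in range(X.shape[1])])

print(np.allclose(dw_matrix, dw_explicit))  # True
```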

thank you !!! your videos help me a lot :)

marilyncancino

You have explained this very easily. Keep it up. :)
You saved my ass!!!

OK-buqf

Great work, bro. I am sure you will reach 100K soon. Best of luck!

jayantmalhotra

I was looking for a basic logistic regression model built algorithmically from scratch. Thank you very much; I like your video.

hoami

Thanks, dude. I was searching all over the web to find out whether the 0.5 truncating mechanism has to go into the function used by gradient descent / the cost function, but you successfully showed me that it is only applied in the prediction step afterwards.

_inetuser
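That separation can be made explicit with two small functions: gradient descent and the cost function work with the continuous probabilities, and only the final prediction applies the 0.5 cutoff. A sketch with assumed names:

```python
import numpy as np

def predict_proba(X, w, b):
    # used during gradient descent: keep the continuous probabilities
    return 1 / (1 + np.exp(-(X @ w + b)))

def predict_label(X, w, b, threshold=0.5):
    # used only for the final class prediction, after training
    return (predict_proba(X, w, b) >= threshold).astype(int)
```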

I just started learning this and tried running the code in a Jupyter notebook; it keeps saying "no module named logistic regression".
It might be a stupid question, but please let me know why this is happening.

justAdancer

Congrats, because a lot of people do not do it from scratch.

akhadtop

So to evaluate test data we should not use fit_transform; only transform is required?

dhivakarsomasundaram
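Right: the scaler should be fitted on the training data only, and the same learned statistics reused for the test data. A sketch using sklearn's StandardScaler (sklearn preprocessing is not part of the video's from-scratch code; it is assumed here for illustration):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X_train = np.array([[1.0], [2.0], [3.0]])
X_test = np.array([[2.0]])

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # learns mean/std from training data
X_test_scaled = scaler.transform(X_test)        # reuses the training statistics
```

Calling fit_transform on the test set would re-estimate the mean and standard deviation from test data, leaking information and making train and test features inconsistent.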

Is learning calculus a prerequisite for this series? I am learning, but I feel a bit lost when it comes to the implementations because it is difficult for me to understand the underlying mathematical concepts. I do appreciate the videos!

PaulWalker-lkgi

Sir, my code (the sigmoid function) is giving an exp overflow error during iteration. How can I overcome it?

anmolvarshney
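That overflow happens because np.exp(-z) blows up for large negative z. A common fix is a numerically stable sigmoid that branches on the sign of z, so the exponent passed to np.exp is never positive (a sketch, not the video's code):

```python
import numpy as np

def stable_sigmoid(z):
    # for z >= 0: e^(-z) <= 1, so 1 / (1 + e^(-z)) is safe
    # for z <  0: rewrite as e^z / (1 + e^z), so e^z <= 1 is safe
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    pos = z >= 0
    out[pos] = 1 / (1 + np.exp(-z[pos]))
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1 + ez)
    return out
```

With this version, inputs like -1000 or 1000 underflow harmlessly to 0 or saturate at 1 instead of raising an overflow warning.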