Logistic Regression Gradient Descent | Derivation | Machine Learning

In this video, we will see the Logistic Regression Gradient Descent derivation. Gradient descent is the algorithm used to minimize the Logistic Regression cost function, and minimizing the cost function improves the model's accuracy.

By the end of the video, you will know the why, what, and how of Logistic Regression Gradient Descent.
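As a rough sketch of the idea covered in the video (my own illustration, not taken from it), batch gradient descent for logistic regression repeatedly nudges the weights opposite to the gradient of the cross-entropy cost. The function and variable names below are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, lr=0.5, n_iters=2000):
    """Batch gradient descent on the logistic regression cross-entropy cost."""
    m, n = X.shape
    w = np.zeros(n)   # weights
    b = 0.0           # bias
    for _ in range(n_iters):
        p = sigmoid(X @ w + b)      # predicted probabilities
        dw = (X.T @ (p - y)) / m    # gradient of the cost w.r.t. w
        db = np.mean(p - y)         # gradient of the cost w.r.t. b
        w -= lr * dw                # update step, scaled by the learning rate
        b -= lr * db
    return w, b

# Toy usage: points below 0.5 are class 0, points above are class 1
X = np.array([[0.0], [0.2], [0.8], [1.0]])
y = np.array([0, 0, 1, 1])
w, b = gradient_descent(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
```

The learning rate `lr` controls the step size of each update; the derivation in the video explains where `dw` and `db` come from.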

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Start your Machine Learning Journey with Coding Lane ⭐

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖


Comments

If you found this video valuable, then hit the *like* button👍, and don't forget to *subscribe* ▶ to my channel as I upload a new Machine Learning Tutorial every week.

MachineLearningWithJay

Thanks a lot. Your presentation is very concise and clear.

lucrainville

The way you teach is amazing. Good job👍👍

mohammadferdosian

Could you explain what the learning rate is and why we use it?

nagadevisanthanakrishnan
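On the learning-rate question above, a toy illustration (my own sketch, not from the video): gradient descent on the simple function f(w) = w², whose gradient is 2w, converges with a small step size but overshoots and diverges with a large one.

```python
def descend(lr, steps=20, w=1.0):
    """Run gradient descent on f(w) = w**2, whose gradient is 2*w."""
    for _ in range(steps):
        w -= lr * 2 * w   # update: w <- w - lr * f'(w)
    return w

small = descend(lr=0.1)   # each step shrinks w by a factor of 0.8: converges to 0
large = descend(lr=1.5)   # each step multiplies w by -2: diverges
```

The learning rate exists to control this trade-off: too small and training is slow, too large and the updates overshoot the minimum.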

Do you know of any scikit-learn or Python packages that perform gradient descent on the loss function? Particularly if you want to optimize more than one parameter, i.e. the inflection point and the limit of the sigmoid curve (if it's not 1)?

johndufek
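On the question above: for plain logistic regression, scikit-learn's `SGDClassifier(loss="log_loss")` runs stochastic gradient descent on the logistic loss. For fitting a sigmoid where both the inflection point and the upper limit are free parameters, one option (a sketch under my own assumptions, not something from the video) is `scipy.optimize.curve_fit`:

```python
import numpy as np
from scipy.optimize import curve_fit

def general_sigmoid(x, L, k, x0):
    # L: upper limit, k: steepness, x0: inflection point
    return L / (1.0 + np.exp(-k * (x - x0)))

# Synthetic, noiseless data with an upper limit other than 1
x = np.linspace(-5, 5, 50)
y = general_sigmoid(x, L=0.8, k=2.0, x0=1.0)

# curve_fit runs nonlinear least squares over all three parameters at once
params, _ = curve_fit(general_sigmoid, x, y, p0=[1.0, 1.0, 0.0])
L_hat, k_hat, x0_hat = params
```

`curve_fit` uses nonlinear least squares rather than vanilla gradient descent, but it optimizes all the parameters jointly, which is what the question asks for.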

Just one confusion: if logistic regression can't find the global minimum when using the linear regression cost function, then how does linear regression find the global minimum? Hope you got my question.

manojsamal

Brother, that holds only for linear regression, because with logistic regression the squared-error curve becomes non-convex.

surajhulke
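Regarding the convexity discussion above, a small numeric check (my own sketch, not from the video): with a sigmoid model, squared error as a function of the weight has regions where the central second difference is negative, i.e. the curve is non-convex there, which is why logistic regression uses the log loss instead.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mse(w, x=10.0, y=1.0):
    # Squared error of a sigmoid model on a single example (x, y)
    return (y - sigmoid(w * x)) ** 2

# Central second difference of the loss around w = -0.4;
# a negative value indicates a locally concave (non-convex) region
second_diff = mse(-0.5) - 2 * mse(-0.4) + mse(-0.3)
```

For linear regression without the sigmoid, the squared-error cost is a convex paraboloid, so gradient descent does reach the global minimum there.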