Derivative of Cost function for Logistic Regression | Machine Learning

We will compute the derivative of the cost function for logistic regression. When implementing the gradient descent algorithm in machine learning, we need the derivative of the cost function, and computing it can be difficult if you are new to derivatives and calculus. But by going step by step, we can compute the derivative of the cost function for logistic regression quite simply.

This will help us minimize the logistic regression cost function and thus improve our model's accuracy.
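
For reference, the result the video derives can be summarized in the following standard form (the notation is assumed here, not quoted from the video: m training examples x^(i) with labels y^(i), prediction a^(i) = σ(wᵀx^(i) + b), and σ the sigmoid function):

```latex
J(w, b) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)} \log a^{(i)} + \left(1 - y^{(i)}\right) \log\left(1 - a^{(i)}\right) \right]

\frac{\partial J}{\partial w} = \frac{1}{m}\sum_{i=1}^{m}\left(a^{(i)} - y^{(i)}\right) x^{(i)},
\qquad
\frac{\partial J}{\partial b} = \frac{1}{m}\sum_{i=1}^{m}\left(a^{(i)} - y^{(i)}\right)
```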

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

This is Your Lane to Machine Learning ⭐

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Comments

If you found this video valuable, then hit the like button👍, and don't forget to subscribe ▶ to my channel as I upload a new Machine Learning Tutorial every week.

CodingLane

There are not many videos explaining how the gradient is derived through partial derivatives. Thank you very much for this treasured video!

mikechen

Probably the greatest explanation ever, making it so easy to understand. Thank you, bro! We need more people like you.

sx.

A heartfelt thanks to you 😭😭 I was going mad over this... but at last you came to my rescue. Thank you ❤️

vikramyadav-fevj

Thanks buddy, that was really helpful. I understood the concept in just 10 minutes!

JokerJax

Wow, I really appreciate your effort in explaining things clearly. Thank you for explaining this concept.

sadaananthanbucheliyan

Hey! I just found your channel and subscribed. Love what you're doing!

I appreciate how clear and detailed your explanations are as well as the depth of knowledge you have surrounding the topic! Since I run a tech education channel as well, I love to see fellow Content Creators sharing, educating, and inspiring a large global audience. I wish you the best of luck on your YouTube Journey, can't wait to see you succeed! Your content really stands out and you've put so much thought into your videos!
Cheers, happy holidays, and keep up the great work!

empowercode

Finally I understood how it works!! Thank you so much!

타케시-gw

Nice one. Towards the end of the derivation of dCost/dw and dCost/db... do we need to take the mean of the values?

amadlover
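
On the mean question above: when the cost is defined with the 1/m average, the gradients inherit the same factor automatically, since differentiation is linear. A minimal sketch of that step, writing the per-example loss as 𝓛^(i) (a name assumed here, not from the video):

```latex
\frac{\partial}{\partial w}\left[ \frac{1}{m}\sum_{i=1}^{m} \mathcal{L}^{(i)} \right]
= \frac{1}{m}\sum_{i=1}^{m} \frac{\partial \mathcal{L}^{(i)}}{\partial w}
```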

Nice video. Where can I find the video on derivative functions? Could you please guide me?

ragook

Thanks so much. I have a small query: isn't the derivative of log x equal to 1/(x ln(10))?

thejohnnybhai
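
On the log-base query above: in the cross-entropy cost, log conventionally denotes the natural logarithm, so no ln(10) factor appears; that factor arises only for base-10 logarithms:

```latex
\frac{d}{dx} \ln x = \frac{1}{x},
\qquad
\frac{d}{dx} \log_{10} x = \frac{1}{x \ln 10}
```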

Thanks for breaking down the math proof. Awesome work.

shonendumm

As always, a very clear and concise explanation. Thank you so much.

VatsalaNundloll

Nice, man. Explained step by step. Good job.

hayki_ds

Liked your clear explanation. Loved how you quickly wipe the board; it saves time. A suggestion: you could further add how you converted the element-wise form to the vectorized form. Thank you.

rahulvansh
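
Picking up the vectorization suggestion above, here is a minimal NumPy sketch of the gradients in vectorized form (the names X, y, w, b and the function gradients are illustrative assumptions, not taken from the video):

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-z))

def gradients(X, y, w, b):
    """Vectorized gradients of the logistic regression cost.

    X: (m, n) feature matrix, y: (m,) labels, w: (n,) weights, b: scalar bias.
    """
    m = X.shape[0]
    a = sigmoid(X @ w + b)  # predictions for all m examples at once
    dz = a - y              # per-example error term (a - y)
    dw = X.T @ dz / m       # matrix product replaces the element-wise sum
    db = dz.mean()          # mean of the errors gives the bias gradient
    return dw, db
```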

The minus sign is applied to the whole equation, i.e. -Sum(y log(a) + (1-y) log(1-a)). Thus, if we take its derivative w.r.t. a, we get -[y/a + (1-y)/(1-a)], whereas you have taken -y/a + (1-y)/(1-a). Can you please comment on this?

mukulsharma
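
On the sign question above: the resolution is the chain rule on log(1 - a), whose inner derivative contributes an extra minus sign. Differentiating the negative log-likelihood for one example term by term:

```latex
\frac{\partial}{\partial a}\Bigl[ -y \log a - (1 - y) \log(1 - a) \Bigr]
= -\frac{y}{a} - (1 - y) \cdot \frac{-1}{1 - a}
= -\frac{y}{a} + \frac{1 - y}{1 - a}
```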

At 5:13, why did you erase the T on the W? Can you please explain that? Also, I think instead of writing W^T X you could write X^T W for easier differentiability (both are the same).

dharshansagargangadharan

This is so great. Thank you so much for explaining.

thongonk

It's amazing, thank you so much. You teach these fairly hard concepts very well and clearly. 🤩
The sound of the video is a little weak, and I have to use earphones to understand clearly what you are saying. I think it would be much better if you recorded your videos (not only this one) with higher volume and better audio quality. 👍👍

mohammadferdosian

Hi Jay, kudos for making informative videos. Could you make a video explaining the derivative of the tanh activation function?

srihitharavu