Logistic Regression - THE MATH YOU SHOULD KNOW!

In this video, we are going to take a look at a popular machine learning classification model -- logistic regression. We will also see the math you need to know.
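For readers who want to try the idea out, here is a minimal sketch of logistic regression fit by maximizing the log-likelihood with plain gradient ascent. The function names, learning rate, and iteration count are illustrative assumptions, not taken from the video; the video's own derivation should be followed for the exact update rule.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, n_iter=5000):
    """Fit logistic-regression coefficients by maximizing the
    log-likelihood with plain gradient ascent.
    X: (n, d) design matrix; y: (n,) labels in {0, 1}."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ beta)        # predicted P(y = 1 | x)
        grad = X.T @ (y - p)         # gradient of the log-likelihood
        beta += lr * grad / len(y)   # averaged ascent step
    return beta
```

On linearly separable toy data this drives the predicted probabilities toward the labels; the `X.T @ (y - p)` term is the same log-likelihood gradient that comes up in the comments below about X versus Xᵀ.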

SUBSCRIBE to my channel for more amazing content!
Comments

This is the most clearly explained and well developed video about the issue I have seen. Most explanations stop with the maximization of the log likelihood function, and I couldn't find how it is maximized until now. I didn't understand a bit, but it's better to know that something is beyond my comprehension than not knowing what it is. Thank you! Subscribed.

UnPuntoyComa

Very good explanation. Only thing: you start really slowly, which is perfect, but then when the math gets messy you speed up tenfold and move on without further explanation. Nonetheless, very useful.

oksaubercool

Love the math full proofs! That stuff is rarely shown even in classes. There is just not enough time to... Great stuff!

Actanonverba

THE BEST! Only 9 minutes to illustrate logistic regression! Really appreciate your brilliant work!

kevinshao

Amazing! Parameter estimation in logistic regression has confused me for so long. I know MLE is used to estimate betas in logistic regression. However, the full math proofs really clarify the way! Really appreciate your video!

qingli

Oh man, you just saved my course project! Thanks for this great help that really explained how the math works!

jerrylu

Sometimes a good demonstration is nothing without an example that deploys the theory in practice.
Thanks for everything :)

mohammedismail

Love the maths part. Definitely my hero in ML.

chriskong

Finally! No one ever gives any significance to the mathematical part.

haifasaud

I find this a really nice video which strikes a good balance between general principles and details (which can be a very tricky thing to do). I had spent some time reading a textbook about the method and had a few uncertainties. This seemed just the ticket to clarify it all.

christophersolomon

Excellent! Clear and logical explanation of all the steps involved.

haraldurkarlsson

Thank you very much!!!! The only guy who could make me understand this subject!!!! You are great

bambinodeskaralhes

Thank you for crystal clear explanations.

hikmatullahmohammadi

Excellent job, congratulations. You sound about 15 years old!!! Even more impressive.

professorg

After going through so many videos, finally understood. Thanks!!

dm

Thanks very much. First time I understand how the coefficients are calculated. Great!

stephanschaefer

I was able to implement this with a minor difference: I used X.T instead of X for the middle term inside the weight update expression

romanwang

Clear explanation, developed in depth. Charming voice and very good structure. Thanks dude!

Trakushun

07:54 In the gradient of the loss function equation, there should be X instead of Xᵀ.

CraftyChaos
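Two comments here pull in opposite directions on whether X or Xᵀ belongs in the gradient. With the common convention that X has shape (n, d), one example per row, the shapes only conform with Xᵀ; this small check (all names are illustrative assumptions, not the video's notation) makes that concrete:

```python
import numpy as np

# Convention assumed here: X is (n, d) with one example per row, beta is (d,).
rng = np.random.default_rng(0)
n, d = 5, 3
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n).astype(float)
beta = np.zeros(d)

p = 1.0 / (1.0 + np.exp(-(X @ beta)))   # predicted probabilities, shape (n,)
grad = X.T @ (y - p)                    # log-likelihood gradient, shape (d,)

# X.T (d, n) maps the (n,) residual to a (d,) gradient matching beta;
# multiplying by X (n, d) instead would not conform.
assert grad.shape == beta.shape
```

If the video instead stacks examples as columns (X of shape (d, n)), the roles of X and Xᵀ swap, which may explain the disagreement.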

Very good explanation. I did that step-by-step derivative with your material. Can you do a video on the math involved in backpropagation?

Saravananmicrosoft