Logistic Regression CLASSIFIER has REGRESSION in its name? #InterviewQuestions #AVshorts


Comments

Wow you explain this in such a simple way in just a minute ❤

Vinit_Gambhir

Basically, in ordinary regression we have a linear model that outputs values on a continuum. In logistic regression, the dependent variable can only take two values, so we want a transformation that makes the linear model output values between 0 and 1 instead. Then we can use a threshold, say 0.5, to classify as 1 or 0. This transformation (the sigmoid function) is derived from its inverse, the log-odds. Log-odds is just the log of the odds, which is another way of quantifying probability: instead of saying 80%, you can say it happens four times as often as not. We apply the log to make the range match the continuum.
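The pipeline described in the comment above (linear score, sigmoid squash, threshold) can be sketched in a few lines. This is a minimal illustration, not anyone's library code; the function names and the 4:1 odds example are my own, chosen to match the 80% example:

```python
import math

def sigmoid(z):
    # Map any real-valued linear output z to the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def log_odds(p):
    # Inverse of the sigmoid: the log of the odds p / (1 - p)
    return math.log(p / (1.0 - p))

def classify(z, threshold=0.5):
    # Threshold the probability to get a 0/1 class label
    return 1 if sigmoid(z) >= threshold else 0

p = 0.8
print(p / (1 - p))           # odds: 80% means 4 times as often as not -> 4.0
print(sigmoid(log_odds(p)))  # round-trips back to 0.8
print(classify(2.0), classify(-2.0))
```

Note that `log_odds` maps (0, 1) back onto the whole real line, which is exactly the "same range as the continuum" point made above.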

christophersoo

I have a question: instead of optimising the parameters by applying gradient descent to the log-loss function (cross-entropy), why can't we just apply gradient ascent to the log-likelihood function instead?
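The two are actually the same update, since the log-loss is just the negative (averaged) log-likelihood. A toy numerical check, using a hypothetical one-parameter model p = sigmoid(w * x) and a numerical gradient of my own choosing:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_likelihood(w, xs, ys):
    # Sum of log P(y | x; w) for the one-parameter model p = sigmoid(w * x)
    total = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return total

def grad(f, w, eps=1e-6):
    # Central-difference numerical gradient of a scalar function f at w
    return (f(w + eps) - f(w - eps)) / (2 * eps)

# Toy data: larger x tends to go with y = 1
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]

w, eta = 0.0, 0.1
# Gradient ascent on the log-likelihood...
w_up = w + eta * grad(lambda v: log_likelihood(v, xs, ys), w)
# ...is the same step as gradient descent on the log-loss (negative log-likelihood)
w_down = w - eta * grad(lambda v: -log_likelihood(v, xs, ys), w)
print(abs(w_up - w_down) < 1e-9)
```

So the choice is purely conventional: optimisation libraries minimise by default, so people write the objective as a loss and descend rather than ascend.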

christophersoo

My ma'am didn't give me marks on that point about continuous values; she says it doesn't accept continuous values...

shushankpanchal