Softmax Regression || Multinomial Logistic Regression || Logistic Regression Part 6

Softmax Regression, also known as Multinomial Logistic Regression, is an extension of logistic regression that handles multiple classes. It computes a probability for each class and selects the one with the highest probability as the predicted class. The softmax activation ensures that the predicted probabilities sum to 1, providing a robust framework for multiclass classification tasks.
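To make this concrete, here is a minimal NumPy sketch of the softmax function itself (the raw scores below are made up for illustration): it turns a vector of class scores into probabilities that sum to 1, and the predicted class is the argmax.

import numpy as np

def softmax(z):
    """Turn raw class scores (logits) into probabilities that sum to 1."""
    z = z - z.max(axis=-1, keepdims=True)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

scores = np.array([2.0, 1.0, 0.1])   # raw scores for 3 classes
probs = softmax(scores)
print(probs)                          # ~[0.66, 0.24, 0.10]
print(probs.sum())                    # 1.0
print(probs.argmax())                 # 0 -> predicted class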

============================
Do you want to learn from me?
============================

📱 Grow with us:

⌚Time Stamps⌚

00:00 - Intro
00:55 - Softmax Regression
08:20 - Training Intuition
14:53 - Prediction
19:58 - Loss Function
34:30 - Softmax Demo in SKlearn
Comments

No one explains such things in depth and in such an easy manner in any course or on YouTube. Thank you so much, sir, for this level of explanation.

raghavagrawal

I subscribed; this channel is one of a kind.
I've not seen a single other channel like this. It's been more than 4 months of learning, and I just wish I had found this earlier!

varunahlawat

Thank you so much, sir, for this level of explanation.

tushargogiya

This video was really hard to find; I was about to give up searching for the mathematics behind multinomial logistic regression! Thank you!

varunahlawat

Your explanations are gold, sir!
Thank you for the video!

rohitekka

Thank you for this easy explanation. You are on top...

Pipython

Thanks, sir. Please complete the mathematical part also.

katw

But sir, in the previous video you used the LogisticRegression() class on the MNIST dataset, which has 10 output labels. But here you say that LogisticRegression can only perform binary classification. Please clarify.

harsh.gupta
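A possible clarification of the question above (not from the video itself): scikit-learn's LogisticRegression accepts multiclass targets directly, and in recent versions the default lbfgs solver fits a single multinomial (softmax) model rather than one-vs-rest. A minimal sketch on a 10-class dataset, with illustrative hyperparameters:

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# digits has 10 classes (0-9), similar in spirit to MNIST
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# With the default lbfgs solver, recent scikit-learn versions fit a
# multinomial (softmax) model for multiclass targets automatically.
clf = LogisticRegression(max_iter=5000)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))            # multiclass accuracy
print(clf.predict_proba(X_test[:1]).sum())  # class probabilities sum to 1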

Sir, please continue the algorithm series with XGBoost and Gradient Boosting.

prasadbhandarkar

Hi, why did we use just two columns for X when we had 4? Please answer.

meetbhadeshia

Brother,
I ran this but 😶😶😶😶
import matplotlib.pyplot as plt
from mlxtend.plotting import plot_decision_regions

# clf is the fitted softmax classifier; X holds the two plotted features
plot_decision_regions(X, y, clf, legend=2)
# Adding axes annotations
plt.xlabel('sepal length [cm]')
plt.ylabel('petal length [cm]')
plt.title('Softmax on Iris')
plt.show()

I'm getting an error and I'm not able to solve it.

TypeError: axis() got an unexpected keyword argument 'y_min'

VishalKumar-choj
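A possible fix for the error above (an assumption, since the exact environment isn't shown): this TypeError usually comes from an older mlxtend release that is incompatible with newer matplotlib, so upgrading mlxtend (pip install -U mlxtend) typically resolves it. Note also that plot_decision_regions draws a 2-D plot, which is why only two of the four Iris columns are used (this also answers the earlier question about using two columns for X). A minimal sketch that works with recent versions:

import matplotlib.pyplot as plt
from mlxtend.plotting import plot_decision_regions
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()
X = iris.data[:, [0, 2]]   # keep 2 features: sepal length and petal length
y = iris.target

clf = LogisticRegression(max_iter=1000).fit(X, y)

plot_decision_regions(X, y, clf=clf, legend=2)
plt.xlabel('sepal length [cm]')
plt.ylabel('petal length [cm]')
plt.title('Softmax on Iris')
plt.show()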

Sir, I have a doubt. Since the dataset is not provided in the GitHub repo, I tried making a classification dataset using the command:
X, y = make_classification(n_samples=100, n_features=2, n_informative=1, n_redundant=0,
n_classes=3, n_clusters_per_class=1, random_state=41, hypercube=False, class_sep=20)

But it seems like it gives me an error. How can I deal with this?

eeshananand
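A likely cause of the error above (inferred from the parameters shown, not from the video): make_classification requires n_classes * n_clusters_per_class <= 2**n_informative, so with n_informative=1 you cannot generate 3 classes. Raising n_informative to 2 satisfies the constraint. A sketch:

from sklearn.datasets import make_classification

# n_classes * n_clusters_per_class (3) must be <= 2**n_informative,
# so n_informative must be at least 2 for 3 classes.
X, y = make_classification(
    n_samples=100, n_features=2, n_informative=2, n_redundant=0,
    n_classes=3, n_clusters_per_class=1, random_state=41,
    hypercube=False, class_sep=20,
)
print(X.shape, set(y))  # (100, 2) {0, 1, 2}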

Don't we use LabelEncoder for the output column, as you mentioned in some other video where you also showed the official scikit-learn docs?

ShreyeshSharma

Brother, please make the whole video in English; it's hard to follow when you keep switching languages.

Philgob

Can you please provide your OneNote notes? It will be easier for us to revise.

sans_moi

Nitish sir, the species column has three classes in it, but why is there only one feature after applying OHE to it? In the training module you said that if there are three classes, OHE will produce three columns. Kindly confirm this.

fullthrottlevishal
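A possible explanation for the question above (an assumption, since the video's preprocessing isn't shown here): a plain OneHotEncoder on a 3-class column does produce three columns; a single column usually means the target was label-encoded instead, while drop='first' would give two columns. A sketch of the difference:

import numpy as np
from sklearn.preprocessing import LabelEncoder, OneHotEncoder

species = np.array(['setosa', 'versicolor', 'virginica', 'setosa'])

# LabelEncoder: a single integer column (what classifiers expect for y)
print(LabelEncoder().fit_transform(species))             # [0 1 2 0]

# OneHotEncoder: one column per class -> three columns here
# (use sparse=False instead of sparse_output=False on older scikit-learn)
ohe = OneHotEncoder(sparse_output=False)
print(ohe.fit_transform(species.reshape(-1, 1)).shape)   # (4, 3)

# With drop='first', one redundant column is dropped -> two columns
ohe_drop = OneHotEncoder(drop='first', sparse_output=False)
print(ohe_drop.fit_transform(species.reshape(-1, 1)).shape)  # (4, 2)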

While doing logistic regression for each label, we're using the sigmoid function, right? What I mean to ask is: don't we use softmax while training?

abhishekkukreja

31:40 - will the formula be applied to one row? So X11 + X12 + ...

oprotectordm

Does anyone have an implementation of logistic regression for multiclass?

mohammadali-yyop
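Not from the video, but for anyone wanting a from-scratch reference: a minimal NumPy sketch of softmax regression trained with batch gradient descent on the cross-entropy loss (the dataset, learning rate, and epoch count are illustrative assumptions):

import numpy as np
from sklearn.datasets import load_iris

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

X, y = load_iris(return_X_y=True)
n_samples, n_features = X.shape
n_classes = len(np.unique(y))

Y = np.eye(n_classes)[y]                    # one-hot encode the targets

Xb = np.hstack([np.ones((n_samples, 1)), X])  # add a bias column
W = np.zeros((n_features + 1, n_classes))     # one weight vector per class

lr, epochs = 0.1, 1000
for _ in range(epochs):
    P = softmax(Xb @ W)                     # predicted class probabilities
    grad = Xb.T @ (P - Y) / n_samples       # gradient of the cross-entropy loss
    W -= lr * grad                          # batch gradient descent step

pred = np.argmax(softmax(Xb @ W), axis=1)
print("training accuracy:", (pred == y).mean())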

Have you posted the math intuition or the scratch code? If yes, please give me the link.

tushargogiya