Accuracy and Confusion Matrix | Type 1 and Type 2 Errors | Classification Metrics Part 1

In this video, we'll explore accuracy and the confusion matrix, unraveling the concepts of Type 1 and Type 2 errors. Join us on this journey to understand how these metrics play a crucial role in evaluating the performance of classification models.
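
As a minimal sketch of what the video covers (a toy example, not code from the video), accuracy and the confusion matrix can be computed with scikit-learn like this; the labels below are made up, with 1 as the positive class:

from sklearn.metrics import accuracy_score, confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # made-up actual labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # made-up model predictions

print(accuracy_score(y_true, y_pred))                       # 0.75 = fraction of correct predictions
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tn, fp, fn, tp)                                       # 3 1 1 3

Here fp counts the Type 1 errors (false alarms) and fn counts the Type 2 errors (misses).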

============================
Do you want to learn from me?
============================

📱 Grow with us:

⌚Time Stamps⌚

00:00 - Intro
00:46 - Accuracy
06:25 - Code Example using SKL
08:35 - Accuracy for a multi-class classification problem
10:18 - How much accuracy is good?
13:16 - The problem with Accuracy Score
15:30 - Confusion Matrix
23:15 - Type 1 Error
25:53 - Confusion Matrix for a Multi-Class Classification Problem
30:03 - When is accuracy misleading?
Comments

Nitish Sir is the Bhagwaan (god) of the Machine Learning realm.

No one comes near you, sir. Top-class teaching skills.

jinks

A situation where the accuracy score can mislead us is when the costs of false positives and false negatives are different. For example, in medical diagnosis, a false negative (saying a patient is healthy when they are actually sick) may be much more costly than a false positive (saying a patient is sick when they are actually healthy). In this case, a model that minimizes false negatives may be more desirable, even if it has a lower overall accuracy.
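
To make that concrete, here is a minimal sketch (toy numbers, not from the video): with 95 healthy and 5 sick patients, a model that simply predicts "healthy" for everyone reaches 95% accuracy yet misses every sick patient.

from sklearn.metrics import accuracy_score, recall_score

y_true = [0] * 95 + [1] * 5    # 95 healthy (0), 5 sick (1)
y_pred = [0] * 100             # model that calls everyone healthy

print(accuracy_score(y_true, y_pred))   # 0.95, looks great
print(recall_score(y_true, y_pred))     # 0.0, every sick patient is a false negative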

talibdaryabi

No words ♥ Sir, please complete all 100 days.

thatsfantastic

Thanks for the amazing explanation. Just a small correction: in sklearn, the order of tp, tn, fp, fn for binary classification is different than what you quoted.

In sklearn it is:
"By definition a confusion matrix C is such that C[i, j] is equal to the number of observations known to be in group i and predicted to be in group j. Thus in binary classification, the count of true negatives is C[0, 0], false negatives is C[1, 0], true positives is C[1, 1] and false positives is C[0, 1]."
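
A quick way to check that layout (a small sketch with made-up labels, not from the video) is to unpack the matrix with ravel(), which returns the counts in the documented order tn, fp, fn, tp:

from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 0, 0, 1]   # made-up actual labels
y_pred = [0, 0, 1, 1, 0, 1]   # made-up predictions

cm = confusion_matrix(y_true, y_pred)   # rows = actual class, columns = predicted class
tn, fp, fn, tp = cm.ravel()
print(cm)               # [[2 1]
                        #  [1 2]]
print(tn, fp, fn, tp)   # 2 1 1 2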

trayambrathore

Thanks for this wonderful playlist, sir. Your content is always on point and easy to understand. I just have a doubt about multi-class classification models, like what you showed at 27:23, where our model has predicted 0 items for Type 1 but it is for Type 0. I couldn't get that part. Can someone please explain?

akshaysingh

The way you teach is literally awesome. Thank you so much for this lecture.

WinYourSoul

Always amazed by your thorough knowledge and way of explaining it. You explain everything very deeply and in a way anyone will understand. Thank you so much. No. 1, Campus X.

siyays

Is there a way to know for which 6 people the model made a mistake?
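
One possible way (a sketch with made-up labels; y_test, y_pred and X_test are placeholder names, not variables from the video) is to compare the true and predicted labels and keep the positions where they differ:

import numpy as np

y_test = np.array([1, 0, 1, 0, 0, 1])   # made-up actual labels
y_pred = np.array([0, 0, 1, 1, 0, 1])   # made-up model predictions

wrong_idx = np.where(y_test != y_pred)[0]   # positions where the model was wrong
print(wrong_idx)                            # [0 3]
# If the test features live in a pandas DataFrame X_test,
# X_test.iloc[wrong_idx] would show exactly which people were misclassified.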

nishantdey

Well explained!! Thanks so much for this tutorial!

harikrishna-harrypth

What if I choose (Heart disease = 0) and (Not a heart disease = 1)? In this situation my confusion matrix will be different from yours. So when should we choose the value 1 or 0?
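
One way to see what changes (a sketch with made-up labels, not from the video): the labels argument of sklearn's confusion_matrix controls which class occupies which row/column, so swapping 0 and 1 only rearranges the same counts. The usual convention is to give the class you care about (here, heart disease) the label 1, so that "positive" means "has the disease".

from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 0, 0, 1]   # made-up labels: 1 = heart disease, 0 = no heart disease
y_pred = [0, 0, 1, 1, 0, 0]   # made-up predictions

print(confusion_matrix(y_true, y_pred, labels=[0, 1]))   # rows/columns ordered 0 then 1:
                                                         # [[2 1]
                                                         #  [2 1]]
print(confusion_matrix(y_true, y_pred, labels=[1, 0]))   # rows/columns ordered 1 then 0:
                                                         # [[1 2]
                                                         #  [1 2]]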

Shashank_Shahi

Great explanation! Thank you so much, Sir.

Cric_best_ininings

Kindly correct the confusion matrix at 19:10; it looks like you have misplaced the FP & FN.

devenderdaila

I think the confusion matrix is not correct: actual along the column, predicted along the row.

from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 0, 0, 1]
y_pred = [0, 0, 1, 1, 0, 1]
confusion_matrix(y_true, y_pred)
# array([[2, 1],
#        [1, 2]], dtype=int64)

y_pred = [0, 0, 1, 1, 0, 0]
confusion_matrix(y_true, y_pred)
# array([[2, 1],
#        [2, 1]], dtype=int64)

Look at this example...

Let me know if I am wrong...

YogaNarasimhaEpuri

Sir, you even wore that same T-shirt: "What's the score".

tanmayshinde

Now I understand why the name of the confusion matrix starts with "confusion" 😆😆

AyeshaAyesha-ii

Why is LR called regression, even though it is a classification algorithm?

vikashkumar-cree

Confusion matrix: it lives up to its name 🥲

flakky