Machine Learning Basics: Confusion Matrix & Precision/Recall Simplified | By Dr. Ry @Stemplicity

This tutorial covers the basics of the confusion matrix, which is used to describe the performance of classification models.

The tutorial will also cover the difference between True Positives, True Negatives, False Positives, and False Negatives which can be described as follows:

• True positives (TP): cases where the classifier predicted TRUE (patient has the disease), and the correct class was TRUE (patient does have the disease).

• True negatives (TN): cases where the model predicted FALSE (no disease), and the correct class was FALSE (patient does not have the disease).

• False positives (FP) (Type I error): the classifier predicted TRUE, but the correct class was FALSE (patient does not have the disease).

• False negatives (FN) (Type II error): the classifier predicted FALSE (no disease), but the patient actually does have the disease.

The tutorial will also cover the difference between classification accuracy, error rate, precision, and recall. These metrics can be summarized as shown below:

• Classification Accuracy = (TP + TN) / (TP + TN + FP + FN)
• Misclassification Rate (Error Rate) = (FP + FN) / (TP + TN + FP + FN)
• Precision = TP / Total TRUE Predictions = TP / (TP + FP) (when the model predicted TRUE, how often was it right?)
• Recall = TP / Actual TRUE = TP / (TP + FN) (when the class was actually TRUE, how often did the classifier get it right?)
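The counts and formulas above can be sketched in a few lines of Python. This is a minimal illustration with made-up labels (not code from the video), using the same disease convention: 1 = has the disease, 0 = no disease.

```python
def confusion_counts(y_true, y_pred):
    """Count TP, TN, FP, FN for binary labels (1 = disease, 0 = no disease)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # Type I error
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # Type II error
    return tp, tn, fp, fn

# Hypothetical ground truth and model predictions for eight patients
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

tp, tn, fp, fn = confusion_counts(y_true, y_pred)
total = tp + tn + fp + fn

accuracy = (tp + tn) / total        # fraction of all predictions that were right
error_rate = (fp + fn) / total      # fraction of all predictions that were wrong
precision = tp / (tp + fp)          # of the TRUE predictions, how many were right?
recall = tp / (tp + fn)             # of the actual TRUE cases, how many were found?
```

Note that accuracy and error rate always sum to 1, while precision and recall each look at a different slice of the matrix: precision reads down the "predicted TRUE" column, recall reads across the "actually TRUE" row.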

If you want to learn more, here’s a link to my new machine learning Classification course on Udemy:

Here’s a link to my new machine learning regression course on Udemy:

Subscribe to my channel to get the latest updates; we will be releasing new videos on a weekly basis:
Comments

This couldn't have been explained any easier than this, thanks

khaibaromari

Well explained with proper examples. Thanks for the video, Dr. Ryan

Rishu_Dakshin

Excellent teaching. Thank you so much.

dr.sangramsinha

Beautiful explanation. Great example as well!

Officialjadenwilliams

This is the best one I have found so far.

anujonthemove

I almost cried when I didn’t understand this. All I needed was weed and this video. Thanks.

Jdjdhsgxuxu

Great explanation, Dr. Ryan. I took your course on Udemy and absolutely loved it. You are a great teacher :)

piyushganar

Well explained. Thanks so much for this explanation. I used it in my final AI thesis project.

xpertstrategist

Thanks a lot, sir. I was confused in class, but now I clearly understand

ambikashetty

Hi,

The content is good. However, I feel the part at 2:46 seems to be incorrect. When we consider + as cancer (1) and - as no cancer (0), ideally 1→0 should be a False Negative and 0→1 should be a False Positive. Please correct me if I am wrong.

arjundev

An amazing explanation. Thanks for clarifying the concepts

javier-medel

You have interchanged Type I error and Type II error. Type I error is more serious; hence, you control it by providing an error tolerance denoted by alpha

sidharthmanne

This was a very nice video to recall Confusion Matrix topic. Please add AUC ROC curve topics explanation.

Holasticlogger

Please, I would like to join your course on Udemy "under the hood" but can't find it

nkechiesomonu

I think you mean the actual positives when you describe recall, not just the actual TRUE. It's a great explanation, but this part may be confusing

wanys

Can I assume that the precision measure is less informative because of the small numbers in the TP & FP?

liornisimov

Any paper that can be used as literature for performance measures > confusion matrix?

Tommy-zbsi

Hey, I also watched your MATLAB course on Udemy; the lectures were superb. I hope this content is also good

sachindubey

Recall is a weird word to use. Recall = Sensitivity. Sensitivity is also a weird word to use. But text books are text books.

nkristianschmidt