The Math Behind Bayesian Classifiers Clearly Explained!

In this video, I've explained the math behind Bayes classifiers with an example. I've also covered the Naive Bayes model.
#machinelearning #datascience
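The Naive Bayes model covered in the video rests on the conditional-independence assumption P(X | Y) = ∏ᵢ P(xᵢ | Y). A minimal sketch of that idea with made-up counts (the data and names here are illustrative, not from the video):

```python
from collections import Counter

# Toy training data (made-up): each row is (feature vector, label).
data = [((0, 1), 0), ((1, 1), 0), ((0, 0), 1), ((1, 0), 1), ((0, 0), 1)]

labels = [y for _, y in data]
prior = {y: c / len(data) for y, c in Counter(labels).items()}

def cond_prob(i, value, y):
    """P(x_i = value | Y = y), estimated by counting."""
    rows = [x for x, lab in data if lab == y]
    return sum(1 for x in rows if x[i] == value) / len(rows)

def naive_bayes_score(x, y):
    # "Naive" assumption: features are independent given the class,
    # so the class-conditional probability factorizes per feature.
    p = prior[y]
    for i, v in enumerate(x):
        p *= cond_prob(i, v, y)
    return p

x = (0, 1)
pred = max(prior, key=lambda y: naive_bayes_score(x, y))
```

In practice one would add smoothing so an unseen feature value does not zero out a class, but the factorization above is the core of the model.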

For more videos please subscribe -

Support me if you can ❤️

The math behind GANs -

Source code -

3blue1brown -

Facebook -
Instagram -
Twitter -
Comments

Dude... I lost count of the videos I watched trying to understand this, but finally, after seeing your video, the struggle ended. Thank you so much!

pradyumnabada

How did he manage to explain something that a 1-hour lecture couldn't! Thanks, mate.

hayleyH

'Clearly Explained' - and it actually was. Thanks, man.

BrianAmedee

One of the best explanations I've ever seen!

bluestar

This was a very clear explanation indeed. Thank you!

jaster_mereel

Been struggling to grasp this topic, but I finally hit that eureka moment with this video. Thank you so much!

uncaged

HUGE thanks for perfectly delivering the whole concept in one video bro!!

sye

One of the best explanations of this topic. Thanks, man.

hussamcheema

In the last part of the video you said we can fit a known distribution to a continuous set of data. However, you then wrote that the probabilities can be calculated by taking the product of the pdf evaluated at different values of the feature and label. The pdf does not give probabilities, though; it must be integrated to obtain the probability of an event. This part of the video seems imprecise.

However, the video in general was great. Thanks.
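The commenter's distinction is right: a pdf value is a density, not a probability. In Gaussian Naive Bayes the density is nonetheless used as the likelihood, which is valid for classification because the infinitesimal interval width dx would multiply every class's score equally and thus cancels in the argmax. A minimal sketch with hypothetical class parameters:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x -- a density, not a probability."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical per-class (mean, std) fitted from training data, and priors.
params = {0: (2.0, 1.0), 1: (5.0, 1.5)}
priors = {0: 0.5, 1: 0.5}

x = 4.2
# Score each class by prior * density. The density alone is not a
# probability, but comparing the scores still yields a valid MAP choice
# because the same interval width dx would scale every class's score.
scores = {y: priors[y] * gaussian_pdf(x, mu, s) for y, (mu, s) in params.items()}
pred = max(scores, key=scores.get)
```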

radoyapanic

Thank you very much for the video. Clearly explained indeed, the only part I couldn't get completely was the discretization.

miusukamadoto

Very nice explanation and perfect illustrations!!

jefersondavidgalloaristiza

You did well, my friend. I'm glad I came across this video.

RayRay-ytpe

The explanation is so cool! But it would be even cooler if you added some examples with continuous features and fitting a distribution; this part wasn't so clear...

high_fly_bird

9:37 you drew the conclusion based on P(X=[0, 2] | Y); I think the correct way is to calculate P(Y | X=[0, 2]). If P(Y=1) is very small, the answer can be Y=0.
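This commenter's point is that the decision should use the posterior P(Y | X) ∝ P(X | Y) P(Y), not the likelihood alone. A tiny numeric sketch (all numbers hypothetical) showing how a small prior can flip the decision:

```python
# Hypothetical numbers: the likelihoods favor Y=1, but a small prior
# P(Y=1) flips the MAP decision to Y=0 -- the commenter's point.
likelihood = {0: 0.05, 1: 0.20}   # P(X=[0, 2] | Y)
prior      = {0: 0.95, 1: 0.05}   # P(Y)

# Unnormalized posterior: P(Y | X) is proportional to P(X | Y) * P(Y).
posterior = {y: likelihood[y] * prior[y] for y in (0, 1)}

map_class = max(posterior, key=posterior.get)    # uses the posterior
ml_class  = max(likelihood, key=likelihood.get)  # likelihood alone
```

With equal priors the two rules agree, which is why comparing likelihoods alone can look correct in a balanced example.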

noname-anonymous-vc

LOVED IT!!!
Awesome Explanation! Can't thank you enough...

EduAidClassroom

That was great! I'm really glad that I found your channel. Thanks a lot 👍👍

parisaghanad

It was clearly explained, as mentioned in the title. Thanks a bunch!!!

sopegue

If I search for any ML algorithm, I first check your channel to see if you have made a video on it... You are my first preference for ML/DL algorithm explanations. Just a request: please make videos on deep learning algorithms too, like CNN, RNN & LSTM "from scratch". It will really help people who want to become AI practitioners, like me.

PritishMishra

Sir, please make more lectures. I have been watching your lectures lately.
Please make some advanced NLP and CV lectures, or AI lectures. Thanks.

muhammadzubairbaloch

Well explained, a quick revision for Naive Bayes. I forgot why it was called "naive" until I watched this video 😂😂

DANstudiosable