Hindi - Naive Bayes Machine Learning Algorithm In-Depth Intuition - Part 1

In statistics, naive Bayes classifiers are a family of simple "probabilistic classifiers" based on applying Bayes' theorem with strong independence assumptions between the features. They are among the simplest Bayesian network models, but coupled with kernel density estimation, they can achieve higher accuracy levels.
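
To make the description above concrete, here is a minimal from-scratch sketch of a Gaussian naive Bayes classifier in Python. It illustrates the general idea only and is not the example worked in the video; the GaussianNaiveBayes class name, the toy data, and the choice of Gaussian likelihoods are assumptions made purely for demonstration.

import numpy as np

class GaussianNaiveBayes:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Class priors P(class) and per-feature mean/variance for the likelihoods.
        self.priors_, self.means_, self.vars_ = {}, {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)
            self.means_[c] = Xc.mean(axis=0)
            self.vars_[c] = Xc.var(axis=0) + 1e-9  # avoid division by zero
        return self

    def _log_likelihood(self, x, c):
        # log P(x | class) under the naive (feature-independence) assumption:
        # the joint likelihood factorises into a product of one-dimensional Gaussians.
        mean, var = self.means_[c], self.vars_[c]
        return np.sum(-0.5 * np.log(2 * np.pi * var) - (x - mean) ** 2 / (2 * var))

    def predict(self, X):
        preds = []
        for x in X:
            # Bayes' theorem up to a constant: posterior is proportional to prior * likelihood.
            scores = {c: np.log(self.priors_[c]) + self._log_likelihood(x, c)
                      for c in self.classes_}
            preds.append(max(scores, key=scores.get))
        return np.array(preds)

# Tiny illustrative usage on made-up two-dimensional data.
X = np.array([[1.0, 2.1], [1.2, 1.9], [3.8, 4.2], [4.1, 3.9]])
y = np.array([0, 0, 1, 1])
model = GaussianNaiveBayes().fit(X, y)
print(model.predict(np.array([[1.1, 2.0], [4.0, 4.0]])))  # expect [0 1]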
Comments

Brother, I don't know if you know this, but when you teach in Hindi your teaching skills are on a God level that other YouTubers can't match at all. When you teach in English it feels like a normal teacher, like other people on YouTube. Your Hindi teaching is god-like, bro.

thepowerofanime

Please upload part 2 of this with one simple example and one real-world example. Please upload it as soon as possible. You're doing a great job, man.

thelalka

I think you are an angel sent by God to serve ML enthusiasts

hassanmehedi

Thank you. You explain everything in such a simple way that anyone can understand.

Vidi_

Thank you, sir. Your explanation of the math behind the naive Bayes algorithm is really awesome.

Sumanth_Siddareddy

Totally the most amazing video out there!! Understood the concepts in one go 🤩 What else do we want 🥰 But wait, there is an example video too 🤑🤑

garvitaggarwal

Thanks, that really helped. Hindi does make understanding easy

priyankashekhawat

Please upload part 2 for better understanding.
We are waiting, Krish...

prernaawasthi

Dear Krish, I am a huge fan of your educational content.
At 7:35, the explanation of P(G and R) is a little off: it is the joint probability of both events (drawing a green ball and drawing a red ball) occurring together. Please let me know if I have understood it incorrectly. Thanks.

AnikG
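
As background for the comment above, the general multiplication rule for a joint probability (written for generic events rather than the video's specific numbers) is:

P(G and R) = P(G) * P(R | G) = P(R) * P(G | R)

so the joint probability of drawing a green ball and a red ball is the probability of one event times the conditional probability of the other given it; only when the two events are independent does this reduce to P(G) * P(R).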

Wow, crystal clear explanation. Thank you, sir.

poojadixit

Bayes' theorem represents a conditional probability, right? But we already have another formula for conditional probability, e.g. P(B|A) = P(B and A) / P(A). So why do we need the Bayes' theorem formula?

kittusingh
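
As background for the question above: Bayes' theorem is that same conditional-probability formula rearranged, shown here for generic events A and B rather than the video's example. From the two symmetric definitions

P(B | A) = P(A and B) / P(A)    and    P(A | B) = P(A and B) / P(B),

eliminating the joint term P(A and B) gives

P(B | A) = P(A | B) * P(B) / P(A).

The rearranged form is useful when P(A | B) and P(B) are easier to estimate than the joint probability itself, which is exactly the situation in naive Bayes, where per-class likelihoods and class priors are estimated from data.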

Thank you so much, Krish Naik. Very nice video.

mahadevbag

Bayes, man, earned his fame just by coming up with an easy formula.

sandipadhikari