#43 Bayes Optimal Classifier with Example & Gibbs Algorithm |ML|

Comments

What you explained for the optimal classifier is wrong.
The most probable classification of a new instance is obtained by combining the predictions of all hypotheses, weighted by their posterior probabilities; that is the concept of the Bayes optimal classifier.
So the formula you used and explained for the optimal classifier is actually the formula and explanation of Naive Bayes.
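The correction in this comment can be sketched in a few lines of Python. The hypothesis posteriors and per-hypothesis class predictions below are assumed toy values (hypothetical hypotheses h1, h2, h3), not numbers from the video:

```python
# Assumed posteriors P(h|D) over three hypothetical hypotheses.
posteriors = {"h1": 0.4, "h2": 0.3, "h3": 0.3}
# Assumed per-hypothesis class predictions P(v|h) for classes "+" and "-".
predictions = {
    "h1": {"+": 1.0, "-": 0.0},
    "h2": {"+": 0.0, "-": 1.0},
    "h3": {"+": 0.0, "-": 1.0},
}

def bayes_optimal(posteriors, predictions):
    # For each class v, sum P(v|h) * P(h|D) over ALL hypotheses,
    # then pick the class with the largest weighted sum. This is the
    # combining-all-hypotheses step that distinguishes the Bayes optimal
    # classifier from using a single (e.g. MAP) hypothesis.
    classes = {"+", "-"}
    scores = {v: sum(predictions[h][v] * p for h, p in posteriors.items())
              for v in classes}
    return max(scores, key=scores.get), scores

label, scores = bayes_optimal(posteriors, predictions)
print(label, scores)  # "-" wins: 0.4 weighted mass for "+" vs 0.6 for "-"
```

Note that the single most probable hypothesis (h1) predicts "+", yet the Bayes optimal answer is "-", because the combined posterior mass behind "-" is larger.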

udaysai

Ma'am, please cover these topics; our exam is on 23 April:
Convergence and local maxima
Representational power of feedforward networks
Hypothesis space search and inductive bias
Hidden layer representations
Generalization
Overfitting
Stopping criterion
And an example: face recognition

bangtangirl

Your explanation is exceptional, but please change that intro music; it feels like I am watching a cartoon. Hope you don't mind. Thank you.

amruthmanda

Got the concept, and I have to say your English fluency is amazing 😅 Trying to catch up to skills like yours for my interviews!! Any tips?

bhushandatre

UNIT - III
Bayesian learning – Introduction, Bayes theorem, Bayes theorem and concept learning, Maximum
Likelihood and least squared error hypotheses, maximum likelihood hypotheses for predicting
probabilities, minimum description length principle, Bayes optimal classifier, Gibbs algorithm, Naïve
Bayes classifier, an example: learning to classify text, Bayesian belief networks, the EM algorithm.
Computational learning theory – Introduction, probably learning an approximately correct hypothesis,
sample complexity for finite hypothesis space, sample complexity for infinite hypothesis spaces, the
mistake bound model of learning.
Instance-Based Learning- Introduction, k-nearest neighbour algorithm, locally weighted regression,
radial basis functions, case-based reasoning, remarks on lazy and eager learning.

govardhanreddy

Ma'am, we have a Compiler Design exam on 21st August (JNTUH) and badly need notes. Bharat Institute of Engineering and Technology.

venkattramana

While adding 0.031 + 0.08571, how does it come to 0.27?
The result must be 0.117.
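For what it's worth, the sum can be checked directly. The two addends are taken from this comment; the guess that the video's 0.27 comes from normalizing the first term is only an assumption:

```python
# Check the commenter's arithmetic (addends as quoted in the comment).
a, b = 0.031, 0.08571
total = a + b
print(round(total, 5))  # 0.11671, i.e. about 0.117, not 0.27

# One plausible source of "0.27": if these are unnormalized posterior
# terms, dividing each by their sum (so they add to 1) gives roughly
# 0.27 for the first term. This is a guess, not something from the video.
print(round(a / total, 2), round(b / total, 2))  # 0.27 0.73
```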

dipakaryal

Ma'am, I guess this concept is the Naive Bayes classifier, because the Bayes optimal classifier is a slightly different concept. Please re-check the topic. =)

kumarrishikesh

Ma'am, please complete the syllabus as soon as possible; we have our ML exam on the 23rd. JNTUH.

Karthik-ilyu

Make videos on Genetic algorithms also :)

sanathgattu

Ma'am, we have a CD exam on 20/2/24. Please make a playlist for chapters 2, 3, 4, 5, please ma'am ❤️❤️

trending

Hello ma'am, are you from ANITS? Because you are explaining in the exact order of our syllabus and question paper questions.
Really helped a lot. 🙏

videos

Ma'am, I need a PDF of the ML subject. Could you please send the PDF?

TeriMusic

THIS is NOT the Bayes optimal classifier; it is the Naive Bayes classifier. I spent hours trying to understand where I was going wrong. Please check the formula before posting.

hell-o

Ma'am, can I write this answer for Bayes optimal, and just this info for Gibbs, if it comes in the exam?

k.vamshi

Make a video on the T-square likelihood ratio criterion with an example.

rajalaxminayak

The total value you counted is wrong there: 0.031 + 0.0857 ≈ 0.117 or so 😓

duttebasa

What if the total is different in both cases?

chenamonijhansi

Please explain Gibbs algorithm properly
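A rough sketch of the Gibbs algorithm, for anyone else confused: rather than averaging over all hypotheses as the Bayes optimal classifier does, it draws a single hypothesis at random according to the posterior P(h|D) and classifies with that one hypothesis alone; under suitable assumptions its expected error is at most twice that of the Bayes optimal classifier. The posteriors and hypothesis names below are assumed toy values, not numbers from the video:

```python
import random

# Assumed posteriors P(h|D) over hypothetical hypotheses h1, h2, h3.
posteriors = {"h1": 0.4, "h2": 0.3, "h3": 0.3}
# Assumed per-hypothesis class predictions P(v|h).
predictions = {"h1": {"+": 1.0, "-": 0.0},
               "h2": {"+": 0.0, "-": 1.0},
               "h3": {"+": 0.0, "-": 1.0}}

def gibbs_classify(rng):
    hs = list(posteriors)
    # Step 1: sample ONE hypothesis with probability P(h|D).
    h = rng.choices(hs, weights=[posteriors[x] for x in hs])[0]
    # Step 2: classify the new instance using only that hypothesis.
    return max(predictions[h], key=predictions[h].get)

# Over many draws, the fraction of "-" answers tracks the posterior
# mass behind "-" (0.6 here), unlike Bayes optimal, which always answers "-".
rng = random.Random(0)
labels = [gibbs_classify(rng) for _ in range(10_000)]
print(labels.count("-") / 10_000)  # close to 0.6
```

So a single Gibbs classification is randomized and can disagree with the Bayes optimal answer, which is exactly why its expected error is worse, but only by a bounded factor.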

ninjaasmoke

Ma'am, can you please speak a little bit slower? It's too fast 🙂

DeepakKumar-skzg