#45 Bayesian Belief Networks - DAG & CPT With Example |ML|

Comments

The explanation was so simple and clear to understand! Hats off to your great work. I have my exam in 6 hours and my anxiety is gone thanks to you, ma'am!

uchihamadara

A patient visits a doctor suspecting that he may have lung cancer after suffering from shortness of breath. The doctor knows that besides lung cancer the patient might be suffering from tuberculosis. He also notes that smoking and exposure to pollution are key causes of both diseases, but taking an X-ray would indicate whether the patient has cancer or tuberculosis. Given that P(patient is a smoker) = 0.2 and P(patient was exposed to low pollution) = 0.8, use a Bayesian network, following all steps, to determine the accuracy of the output from the X-ray results.

Junienyaru
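
A minimal sketch of how such a network could be set up and queried by brute-force enumeration. The DAG has pollution and smoking as parents of both cancer and tuberculosis, with the X-ray result as a child of the two diseases. Every CPT entry other than the given P(smoker) = 0.2 and P(low pollution) = 0.8 is a hypothetical placeholder, to be replaced with the exercise's actual numbers:

from itertools import product

p_smoker = 0.2          # given: P(patient is a smoker)
p_low_pollution = 0.8   # given: P(patient was exposed to low pollution)

# Hypothetical CPTs, indexed by (low_pollution, smoker).
p_cancer = {(True, True): 0.05, (True, False): 0.02,
            (False, True): 0.10, (False, False): 0.04}
p_tb = {(True, True): 0.03, (True, False): 0.01,
        (False, True): 0.06, (False, False): 0.02}
# Hypothetical CPT: P(X-ray positive | cancer, tuberculosis).
p_xray = {(True, True): 0.99, (True, False): 0.90,
          (False, True): 0.80, (False, False): 0.05}

def joint(lp, s, c, t, x):
    # Chain-rule factorisation implied by the DAG:
    # P(LP) * P(S) * P(C | LP, S) * P(T | LP, S) * P(X | C, T)
    pr = p_low_pollution if lp else 1 - p_low_pollution
    pr *= p_smoker if s else 1 - p_smoker
    pr *= p_cancer[(lp, s)] if c else 1 - p_cancer[(lp, s)]
    pr *= p_tb[(lp, s)] if t else 1 - p_tb[(lp, s)]
    pr *= p_xray[(c, t)] if x else 1 - p_xray[(c, t)]
    return pr

# Marginal P(X-ray positive): sum the joint over the other variables.
p_x_pos = sum(joint(lp, s, c, t, True)
              for lp, s, c, t in product([True, False], repeat=4))
print(f"P(X-ray positive) = {p_x_pos:.4f}")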

Thank you, ma'am. This topic was very frustrating for me and you cleared my doubts in just 15 minutes 🙏🙏🙏🙏

Haven_Hue

Amazing lectures. I'm effortlessly clear on the concepts now.

Forever._.curious..

Your explanation is awesome. This is the first time I am listening to you.

sridharmakkapati

Ma'am, please complete the 5th chapter as per the JNTUH syllabus; we have an exam on Tuesday.

kavyasri

Please complete 3 units, ma'am, and try to complete units 4 or 5 to be on the safe side.
Thank you, ma'am.

pavankalyan

Given a discrete K-class dataset containing N points, where sample points are described using D features, each feature capable of taking V values, how many parameters need to be estimated for a Naïve Bayes classifier?

a) V^D K
b) K^V^D
c) VDK
d) K(V + D)

Please answer this GATE sample question, ma'am.

pooraniayswariya
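
In case it helps, here is the counting argument as I understand it. The class prior needs K - 1 free parameters, and each of the K * D class-conditional distributions over V feature values needs V - 1 more, so the exact count is (K - 1) + K*D*(V - 1); dropping the -1 terms gives order V*D*K, which matches option (c). A tiny sketch to make it concrete (the function name and example numbers are just for illustration):

def naive_bayes_params(K, D, V, exact=True):
    # Prior P(class): K - 1 free parameters (they sum to 1).
    # Likelihoods P(feature_j = v | class): K * D distributions over
    # V values, each with V - 1 free parameters.
    if exact:
        return (K - 1) + K * D * (V - 1)
    return K * D * V  # order-of-magnitude count, i.e. option (c)

print(naive_bayes_params(K=3, D=10, V=4))               # 92 exact
print(naive_bayes_params(K=3, D=10, V=4, exact=False))  # 120, O(VDK)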

Can you explain these topics:
Advanced Knowledge Representation and Reasoning: Knowledge Representation Issues, Non- monotonic Reasoning,
Other Knowledge Representation Schemes.
Reasoning Under Uncertainty: Basic probability, Acting Under Uncertainty, Bayes’ Rule, Representing
Knowledge in an Uncertain Domain, Bayesian Networks.

maheshbabupolanki

Great explanation! However, going by the logic of calculating only the parent probabilities, we would also need calculations for bark and hide. Just a thought.

bingochipspass

I think it should be P(A | B, E) instead of P(A | ¬B, ¬E) in the example. If I am wrong, please mention why, ma'am!

kumbhanagarjuna
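
If the example in the video is the standard burglary-earthquake-alarm network, the quantity being computed is P(J ∧ M ∧ A ∧ ¬B ∧ ¬E): both neighbours call and the alarm sounds while neither a burglary nor an earthquake occurs. Since the event includes ¬B and ¬E, the alarm factor has to be P(A | ¬B, ¬E), not P(A | B, E). A minimal sketch, assuming the textbook (Russell & Norvig) CPT values; swap in the video's numbers if they differ:

p_b, p_e = 0.001, 0.002                          # P(Burglary), P(Earthquake)
p_a = {(True, True): 0.95, (True, False): 0.94,  # P(Alarm | B, E)
       (False, True): 0.29, (False, False): 0.001}
p_j = {True: 0.90, False: 0.05}                  # P(JohnCalls | Alarm)
p_m = {True: 0.70, False: 0.01}                  # P(MaryCalls | Alarm)

# The event is "both neighbours call, the alarm sounds, and there is
# NO burglary and NO earthquake", so the alarm factor is P(A | ~B, ~E).
p = p_j[True] * p_m[True] * p_a[(False, False)] * (1 - p_b) * (1 - p_e)
print(f"P(J, M, A, ~B, ~E) = {p:.6f}")  # about 0.000628

Each factor conditions only on a variable's parents in the DAG, which is what makes the joint this cheap to evaluate.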

Can you show how you came up with the probabilities 9, 3, 18 & 18?

Saisharath

I have an exam on 24th August on the JNTUH machine learning syllabus; please upload the complete syllabus topics.

katkamshravani

UNIT - III
Bayesian learning – Introduction, Bayes theorem, Bayes theorem and concept learning, Maximum
Likelihood and least squared error hypotheses, maximum likelihood hypotheses for predicting
probabilities, minimum description length principle, Bayes optimal classifier, Gibbs algorithm, Naïve
Bayes classifier, an example: learning to classify text, Bayesian belief networks, the EM algorithm.
Computational learning theory – Introduction, probably learning an approximately correct hypothesis,
sample complexity for finite hypothesis space, sample complexity for infinite hypothesis spaces, the
mistake bound model of learning.
Instance-Based Learning- Introduction, k-nearest neighbour algorithm, locally weighted regression,
radial basis functions, case-based reasoning, remarks on lazy and eager learning.

govardhanreddy

The above-mentioned examples are enough to write in the exams.

prakashp

Hi Trouble Free, all your videos are simple and clear. I have an exam next month, so please explain gradient ascent training of Bayesian networks.

vindyasemith
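
Until a full video exists, here is a minimal sketch of the idea from Mitchell's Machine Learning (Sec. 6.11): treat each CPT entry w_ijk = P(Y_i = y_ij | Parents(Y_i) = u_ik) as a weight and do gradient ascent on the log-likelihood of the data. To keep every entry a valid probability, this sketch parametrises each entry through a sigmoid instead of Mitchell's update-then-renormalise step, and it uses synthetic, fully observed data, where the required posterior P(y_ij, u_ik | d) reduces to a 0/1 indicator; with hidden variables it would come from inference over the network.

import math, random

random.seed(0)

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Synthetic fully observed (parent, child) data with true
# P(child=1 | parent=True) = 0.8 and P(child=1 | parent=False) = 0.3.
data = []
for _ in range(5000):
    parent = random.random() < 0.5
    child = random.random() < (0.8 if parent else 0.3)
    data.append((parent, child))

theta = {True: 0.0, False: 0.0}  # logits of the CPT entries P(child=1 | parent)
eta = 0.001
for _ in range(300):
    grad = {True: 0.0, False: 0.0}
    for parent, child in data:
        # Gradient of the log-likelihood of one example w.r.t. the logit
        # of its CPT entry: (observed child value) - (current probability).
        grad[parent] += (1.0 if child else 0.0) - sigmoid(theta[parent])
    for k in theta:
        theta[k] += eta * grad[k]  # gradient-ascent step

print({k: round(sigmoid(t), 3) for k, t in theta.items()})
# converges toward the relative frequencies, about 0.8 and 0.3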

You explained it very well, ma'am. Thank you so much.

Anand____

Hi! How did you come up with the values for the dog-bark problem and the problems that follow?

c.meghana

Please complete the 3rd unit at least by tomorrow, ma'am.
We have an exam on the 24th.
That way at least we'll pass.

kondangantishruthiigoud

Please explain maximum likelihood for predicting probabilities.

varshithamunaganuri