Prior And Posterior - Intro to Statistics

In this video, Udacity founder and AI/ML (artificial intelligence and machine learning) pioneer Sebastian Thrun gives you an introduction to prior and posterior probabilities in statistics.

---

Connect with us on social! 🌐
Comments

Starting at 0:49 there's a very confusing mistake in the video: what is written on screen is labeled the posterior P(C|Pos), but it is actually P(C and Pos).
Correcting this video would be helpful in reducing confusion.
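
A minimal numeric sketch of the difference, with hypothetical numbers (P(C) = 0.01, P(Pos|C) = 0.9, P(Pos|~C) = 0.1, which are not necessarily the video's figures):

# Hypothetical numbers for illustration only; not necessarily the video's values
p_c = 0.01                # prior P(C)
p_pos_given_c = 0.9       # P(Pos|C)
p_pos_given_not_c = 0.1   # P(Pos|~C)

# Joint probability P(C and Pos): prior times likelihood (what the slide shows)
p_c_and_pos = p_c * p_pos_given_c

# Posterior P(C|Pos): the joint divided by the total probability of a positive test
p_pos = p_c * p_pos_given_c + (1 - p_c) * p_pos_given_not_c
p_c_given_pos = p_c_and_pos / p_pos

print(p_c_and_pos)    # 0.009  -- the joint, labelled "posterior" in the video
print(p_c_given_pos)  # ~0.083 -- the actual posterior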


yonashlomo

The author, Sebastian Thrun, really is a famous Stanford professor. He has mentioned that the formula is incomplete: both formulas at the end of the clip need to be divided by a normalizing constant, which is P(C)*P(Pos|C) + P(~C)*P(Pos|~C).
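
A rough Python sketch of that normalization, using hypothetical values for P(C), P(Pos|C), and P(Pos|~C):

# Hypothetical values, for illustration only
p_c, p_not_c = 0.01, 0.99
p_pos_given_c, p_pos_given_not_c = 0.9, 0.1

# Unnormalized quantities, as written at the end of the clip
unnorm_c = p_c * p_pos_given_c
unnorm_not_c = p_not_c * p_pos_given_not_c

# Normalizing constant: P(C)*P(Pos|C) + P(~C)*P(Pos|~C), i.e. P(Pos)
z = unnorm_c + unnorm_not_c

p_c_given_pos = unnorm_c / z
p_not_c_given_pos = unnorm_not_c / z
print(p_c_given_pos + p_not_c_given_pos)  # 1.0 -- proper posteriors sum to one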

xsli

Your posterior formulas are wrong in terms of Bayes' theorem: P(Pos) should multiply the left-hand side of the equations.

ThachDo

For those of you who are wondering: there's nothing wrong with the formula, the video is just "INCOMPLETE". I've taken their full Stats course, and in the next part of the video they divide both posteriors by the normalizing constant, that is, by P(Pos), the probability of a positive test.

And when you multiply P(C) * P(Pos|C), you get P(C, Pos), which is not the same as the posterior P(C|Pos); to get P(C|Pos) we need P(C, Pos) / P(Pos), which is the same as P(C) * P(Pos|C) / P(Pos). For brevity, the author has written a shortcut formula; see the sketch below.
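
A small sketch of that chain of identities (the function name and the numbers below are hypothetical):

def posterior(prior, likelihood, likelihood_if_not):
    # P(C, Pos) = P(C) * P(Pos|C)
    joint = prior * likelihood
    # P(Pos) = P(C, Pos) + P(~C) * P(Pos|~C)
    evidence = joint + (1 - prior) * likelihood_if_not
    # P(C|Pos) = P(C, Pos) / P(Pos)
    return joint / evidence

print(posterior(0.01, 0.9, 0.1))  # ~0.083 with these made-up inputs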

Before jumping to conclusions, please understand that they've put this video up just as a "bite-sized trailer" for the course. It's not meant to be a full-fledged intro to Bayes' theorem; it's Udacity's style to break videos up into really small chunks (even 4 seconds long) and follow everything up in really short sequences. It's sheer naivety to think a platform like Udacity would make such an obvious mistake. Folks, stop assuming you're a genius for a second and try to reason before judging something.

AmithAdiraju

He made a mistake that confused us: the formula should be P(Pos, Cancer) = P(Pos | Cancer) * P(Cancer).

maoxuliu

Nice introduction, but the rest of the video gets a little confusing when a "g" is written instead of a "9".

jakob

Can we say that the prior probability is the same as the "unconditional probability", i.e., the probability in the absence of any explanatory variable? According to your example, p(c) = 0.3, i.e., the probability of getting cancer is 30% in the absence of all explanatory variables / risk factors as used within "epidemiology". If this is not the case, how should one differentiate between the two, if that even makes sense?
Thanks.
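
One way to see the distinction being asked about, as a hedged sketch (the prior 0.3 is taken from the comment; the test probabilities are assumed):

# Prior = unconditional (marginal) probability of cancer, before any evidence
p_c = 0.3                 # prior from the comment's example
p_pos_given_c = 0.9       # assumed sensitivity (hypothetical)
p_pos_given_not_c = 0.2   # assumed false-positive rate (hypothetical)

prior = p_c  # no conditioning on any explanatory variable

# After observing evidence (a positive test), the posterior replaces the prior
posterior = (p_c * p_pos_given_c) / (p_c * p_pos_given_c + (1 - p_c) * p_pos_given_not_c)

print(prior)      # 0.3   -- belief with no evidence
print(posterior)  # ~0.66 -- belief after observing the positive test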

mikiallen

Why do professors always use the cancer example when they talk about Bayes' theorem? I wish there were a less depressing example. Give me the probability that I get to have puppies given that my dog is fat. Also, it's wrong.

keyleegibbons

It really seems that this video is part of a bigger context that is not explained here...

igorcryptor

please do not watch this incomplete video

tosifkhan

I wouldn't call this an 'intro' to statistics.

jackpainting

this video has been here for 11 years, delete it already

Shamssali

the way Mr. Sebastian illustrates this is really bad

zaher

Can't understand what you are saying; you might want to speak your native language next time, because I don't think any English-speaking people can understand you.

jackthederp