Live 2020-05-18!!! Bayes' Theorem

Today we're going to talk about Bayes' Theorem. This is one of those fundamental concepts that underlie how statistics works. Understanding Bayes' Theorem is a key stepping stone towards understanding Bayesian statistics.
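As a quick illustration of the theorem the video covers, here is the classic disease-testing calculation in Python. The numbers are my own hypothetical example, not from the video: a rare disease, a fairly accurate test, and the (often surprising) posterior probability of disease given a positive result.

```python
# Hypothetical worked example of Bayes' Theorem (my numbers, not the video's):
# posterior = likelihood * prior / evidence

def bayes_posterior(prior, likelihood, evidence):
    """P(hypothesis | data) = P(data | hypothesis) * P(hypothesis) / P(data)."""
    return likelihood * prior / evidence

p_disease = 0.01            # prior: P(disease)
p_pos_given_disease = 0.95  # likelihood: P(+ | disease)
p_pos_given_healthy = 0.05  # false-positive rate: P(+ | no disease)

# Evidence via the law of total probability:
# P(+) = P(+ | disease) * P(disease) + P(+ | no disease) * P(no disease)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = bayes_posterior(p_disease, p_pos_given_disease, p_pos)
print(round(posterior, 3))  # → 0.161
```

Even with a 95% accurate test, the posterior is only about 16%, because the prior is so small. BAM.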

For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...
...or...

...buying a StatQuest Study Guide...

...a cool StatQuest t-shirt or sweatshirt:

...buying one or two of my songs (or go large and get a whole album!)

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
Comments

You are one of the greatest channels here on YouTube.👍👍👍👍

mohammadmousavi

@23:46, shouldn't the p(no love C) be all the yellow AND red dots?

IshwarHosamani

I feel learning Bayesian inference is particularly satisfying in stats, and this video really shows the power of this simple equation.

taotaotan

Perfect timing. I was studying Machine Learning at the time this was streamed. :D

samson

"Oi brazil" got me dead AHAHAHAHAH Big BAM from Portugal! Underrated explanation, deserves way more than 14k views...

joaojulio

I wish the law of total probability had been introduced to show how the denominator can be split into multiple terms, and that one of those terms is exactly the numerator. That would be clearer than just portraying the denominator as 'scaling'. This was a key insight that helped me think intuitively when learning Bayes' Theorem.
At 29:00 you mentioned it's possible for the likelihood to go over 1 and that this denominator makes sure that never happens. However, isn't the denominator a probability, which is between 0 and 1? How can a denominator capped at 1 scale down something bigger than 1 in the numerator? Or is this whole video about discrete/multinomial naive Bayes, with likelihoods only appearing in continuous Gaussian naive Bayes? What would the denominator look like in Gaussian NB? (I couldn't find any mention of the denominator in the Gaussian NB video.)

Han-veuh
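The question above about likelihoods exceeding 1 can be answered with a small numeric sketch. This is my own example, not code from the video, and it assumes Gaussian class-conditional densities: in the continuous case the "likelihoods" are density values that can exceed 1, and the denominator is the same kind of density-weighted sum (the law of total probability), so it is not capped at 1 either. It always scales the posteriors so they sum to 1.

```python
# Sketch (my own example): with narrow Gaussians, density values exceed 1,
# and so can the denominator -- yet the posteriors still normalize correctly.
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a Normal(mu, sigma) distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

priors = {"A": 0.5, "B": 0.5}
params = {"A": (0.0, 0.1), "B": (1.0, 0.1)}  # narrow Gaussians: peak density > 1

x = 0.05
# Numerators: density-weighted priors, P(x | class) * P(class)
numerators = {c: gaussian_pdf(x, *params[c]) * priors[c] for c in priors}
# Denominator: law of total probability -- the sum of those same numerators
evidence = sum(numerators.values())
posteriors = {c: numerators[c] / evidence for c in numerators}

print(evidence > 1)                                # the denominator exceeds 1 here
print(abs(sum(posteriors.values()) - 1) < 1e-12)   # posteriors still sum to 1
```

So the denominator is not a probability in the continuous case; it is a density, and dividing by it renormalizes whatever the numerators are.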

Hi Josh, I tried to connect to starquest.org to buy the study guides but it gives an error :( Can you kindly let me know where to buy them? :) Thanks!

PS-jrvi

What nice timing! My exam is early next month and this is one of the chapters I am weak in!

shieldofchaos

It's so cool to see your "BAM's" live! - Trinidad and Tobago

J

After looking for eli5 explanations and getting only eli25 math-major answers, I can finally rest my search here. As always, Josh saves the day! <3
P.S. I'd like to request an eli5 on ZDDs (Zero-Suppressed Decision Diagrams; I understand they have more to do with CS than stats) and their applications to statistics.

demondaze

Each Bam of yours makes statistics sexier and sexier :D Thanks a lot Josh! You are the best!

forestsunrise

Always heard your voice... first time seeing your face. Big fan from Malaysia!

FaizFinance

Hi Josh! Thank you for your hard work and for this video!

gloriouz

Thanks, Josh!!!
BTW, how did you manage to pause your aging process?

ksrajavel

Love the video, it gave some very useful insights! Please explain MCMC in the future!

lucha

Is there a video for hypothesis testing...?

DreamCodeLove

The PayPal link on your channel has the name Laurence Torr... Just wanted to check if a tip would reach you?

JscWilson

It's ok that Stat Guy didn't have an accent. I can just imagine that Josh Starmer is telling us about a conversation that he had with Stat Guy.

gigz

Got the study guides, love it! Please do more, thank you Josh!

julieirwin

Hey Josh, if I may make a request: can you do a video explaining how the Bayesian and frequentist approaches to hypothesis testing differ? It looks like you're neck deep in neural networks, but if I may, I'd like to put this in the queue. Much appreciated!

paulsr