A friendly introduction to Bayes Theorem and Hidden Markov Models

40% discount code: serranoyt

A friendly introduction to Bayes Theorem and Hidden Markov Models, with simple examples. No background knowledge needed, except basic probability.
Accompanying notebook:
Comments

Happy I found this video... even though it was rainy outside.

pauldacus

Usually Bayes' Theorem and HMMs are a nightmare even for researchers. In this video these nightmares are made into child's play. I'm highly thankful for this service you are providing to the academic community: teachers, researchers, and students. Keep it up, Luis Serrano, and I hope to see many more in the future!!!

csejpnce

You are one of that rarest breed of gifted teachers.

somdubey

Your video tutorials are a great breakdown of very complex information into very understandable material. Thank you. It would be great if you could make a detailed video on PCA, SVD, Eigenvectors, Random Forest, CV.

codebinding

Wow, perfect explanation. Even a kid can learn HMMs by watching this video.

simpleprogramming

The most exciting thing I found in your videos is that most of them are a one-stop solution for dummies like me, without the need to go to 100 other places to find 50 missing pieces of info. Many thanks!

chenqu

Thank you so much for this great video, Luis. I am a Udacity alumnus myself. I have watched and read many videos and articles on Bayes and HMMs, but your video is by far the best. It explains all the steps in the right amount of detail and does not skip any steps or switch examples. The video really helped solidify the concept, and giving the applications of these methods at the end really helps put them in context. Thank you again very much for your informative and helpful video.

BabakKeyvani

Hi Luis, thank you for your friendly introduction. While working on an assignment and trying to implement the Viterbi method following your explanation, I noticed that there may be some mistakes in your calculations. You calculate the best path starting from the beginning (the leftmost side) and select the weather condition (sunny or rainy) with the max value at each step. However, I am not sure that this is the correct way to apply Viterbi. You don't mention anything about backpointers.

I reviewed the HMM chapter of Speech and Language Processing by Dan Jurafsky. There, it is stated that to find the best path we should start from the end (the rightmost side). First we select the weather condition with the max probability (that is actually the last node of our path; we find the full path in reverse order). Then we do a backward pass and select the weather condition which maximizes the probability of the condition we have just selected, instead of just looking for the max probability among all conditions at that observation time. We continue this process until we reach the beginning.

Two things to emphasize:
1- We go backward (from end to start).
2- We don't just select the weather condition with the maximum probability at each observation time; instead, we select the max one only once, at the end of the sequence (the start of the backward pass), and then select the conditions that maximize the one that comes after, like a chain connection.

If I am wrong, please enlighten me.
Best.

sametcetin
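
For anyone comparing the two approaches, here is a minimal Python sketch of Viterbi with backpointers, illustrating the backward pass described in the comment above. The sunny/rainy states and the start, transition, and emission probabilities below are assumed for illustration only and are not necessarily the exact numbers used in the video.

states = ["Sunny", "Rainy"]
observations = ["Happy", "Grumpy", "Happy"]

# Assumed, illustrative probabilities (not necessarily the video's exact numbers).
start_p = {"Sunny": 2 / 3, "Rainy": 1 / 3}
trans_p = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}
emit_p = {
    "Sunny": {"Happy": 0.8, "Grumpy": 0.2},
    "Rainy": {"Happy": 0.4, "Grumpy": 0.6},
}

def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s]    = probability of the best path ending in state s at time t
    # back[t][s] = previous state on that best path (the backpointer)
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, prob = max(
                ((p, V[t - 1][p] * trans_p[p][s]) for p in states),
                key=lambda x: x[1],
            )
            V[t][s] = prob * emit_p[s][obs[t]]
            back[t][s] = prev
    # Backward pass: choose the best final state once, then follow backpointers.
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path)), V[-1][last]

best_path, best_prob = viterbi(observations, states, start_p, trans_p, emit_p)
print(best_path, best_prob)  # ['Sunny', 'Sunny', 'Sunny'] with these assumed numbers

With these toy numbers the forward max-at-each-step choice and the backpointer path happen to agree, but in general they can differ, which is exactly the point raised above.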

You have just saved me. This was such a clear breakdown of Bayes' Theorem and HMMs, and exactly what I needed at the 11th hour of a project I'm working on!

Slush_

I was quite tense when my supervisor pointed out that my master's thesis should incorporate HMMs. This video is my first introduction to HMMs. You chased my fears away with your simple explanation and tone. Forever grateful.

AbeikuGh

I have a midterm in 8 hours and this video is the only thing that's really helped me so far. It cleared up all my confusion from 8 lectures in 32 minutes. Thank you so much, from the bottom of my heart.

me-zbqm

OMG. You just replaced an entire dry, incomprehensible bioinformatics book! I can't thank you enough! It's so easy!

LizaBrings

This example made everything crystal clear. I have an exam tomorrow on HMMs; initially I was anxious, but after this video I'm sure I can solve any problem.
Thank you very much, sir.

aatmjeetsingh

I wish professors would just show this video in lectures... You are great at making these animations and your speech is perfect. Thank you!

kassymakhmetbek

This is the best video that explains HMM so simply to someone who doesn't have a computer science background. Godspeed to you

mrinmoykshattry

Your videos are amazing! As someone who hasn't looked at calculus in 20 years, I find these "friendly introduction" videos extremely helpful in understanding high-level machine learning concepts, thank you! These videos really make me feel like this is something I can learn.

MasterRayX

This is the best description of this topic I have ever seen. Crystal clear! True knowledge is when you can explain a complex topic as simply as this!

shuchitasachdev

I am a bio-organic chemist and we have a bioinformatics course that includes Hidden Markov Models, and your video helped me learn the idea without immersing myself deep in the mathematics. Thanks...

muhammadyousuf

I wasted the whole day trying to understand HMMs by watching useless YouTube videos, until I saw this. Thank you so much for this video. It is so simple and so intuitive. So very thankful to you :)

pratiksharma

OMG! You are amazing! I consider myself an information theory guy and should know this pretty well, but I could never present this idea as simply and understandably as you did! Great, great job! I will for sure check out your other videos! Thank you!

changyulin