Markov Chains: Data Science Basics

The basics of Markov Chains, one of my ALL TIME FAVORITE objects in data science.
Comments

Brilliant explanation, I cannot thank you enough. Markov chains are so important; we easily get lost in linear thinking, and Markov chains help us see probabilities differently. More videos on this topic would be highly appreciated.

diegososa

Yes, I would like more videos on Markov Chains. Thank you for your videos.

RD-zqky

Incredibly useful! You manage to explain difficult concepts in a straightforward and easy way. Thank you for these videos!!

ingenierocivilizado

How exactly is Sunny W2 0.44? If there is a 0.3 chance of the day after a sunny day also being sunny, how did your probability INCREASE for W2? It seems there is either an error here or something was left out of this explanation.

beyerch
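
The 0.44 asked about above can be reproduced with a transition matrix and a week-1 distribution that are consistent with the numbers quoted in this thread (0.3, 0.44, 0.56); the exact values in the sketch below are an assumption, not a transcript of the video. The point is that the sunny probability can rise above 0.3 because rainy days also transition to sunny:

```python
import numpy as np

# Transition probabilities reconstructed from the numbers quoted in this
# thread; they are an assumption, not taken verbatim from the video.
# Rows = current state (sunny, rainy), columns = next state.
P = np.array([[0.3, 0.7],   # sunny -> sunny, sunny -> rainy
              [0.5, 0.5]])  # rainy -> sunny, rainy -> rainy

w1 = np.array([0.3, 0.7])   # assumed week-1 distribution: [P(sunny), P(rainy)]

# Week 2: each next-state probability sums contributions from BOTH current states.
w2 = w1 @ P
print(w2)  # [0.44 0.56]: P(sunny in W2) = 0.3*0.3 + 0.7*0.5 = 0.44
```

In other words, the 0.3 self-transition is only part of the story: the 0.7 probability mass sitting on rainy days flows into sunny with probability 0.5, which is what pushes the total up to 0.44.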

Sir, I think I am in love with you. How is it possible that you explain everything so simply and clearly, while my teacher sucks at making me understand what a Markov chain is? Why aren't university teachers taught how to teach and explain the material with such simplicity? Thank you for your explanations. You have helped me through linear algebra. Thank you. You were born to teach.

stephanieb.

This is the first video where I actually understand the Markov chain, thank you. I watched the commercial so you get paid. Thanks a lot.

jakecamposano

You have a wonderful knack for explaining concepts. Thank you

ramankutty

This is so brilliantly well explained, thank you. I was not getting it at all before.

CleverSmart

A brilliant intro, thank you!

Just a small addition: the steady-state vector is (unsurprisingly) an eigenvector of the transition matrix, specifically a left eigenvector when the rows sum to 1, with a corresponding eigenvalue of 1 (once again unsurprising, as the matrix is normalized).

A video on Markov chain Monte Carlo (MCMC) would be nice.

nikkatalnikov
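
To make the eigenvector remark above concrete, here is a minimal NumPy sketch reusing the hypothetical two-state transition matrix from earlier in this thread: with the row-vector convention pi = pi P, the steady state is a left eigenvector of P (equivalently, an eigenvector of P transpose) with eigenvalue 1.

```python
import numpy as np

P = np.array([[0.3, 0.7],   # hypothetical row-stochastic transition matrix
              [0.5, 0.5]])  # (each row sums to 1)

# Steady state solves pi = pi @ P, i.e. pi is an eigenvector of P.T with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(vals - 1.0))   # locate the eigenvalue closest to 1
pi = np.real(vecs[:, idx])
pi = pi / pi.sum()                    # rescale so the entries sum to 1
print(pi)                             # approx [0.4167, 0.5833]

# Sanity check: iterating the chain from any starting distribution
# converges to the same vector.
w = np.array([1.0, 0.0])
for _ in range(50):
    w = w @ P
print(w)                              # approx [0.4167, 0.5833]
```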

That example you gave at the end: I had been asking people one by one, but no one gave me a direct, simple answer like the one you gave in this video. Thanks a lot for the video.
You deserve the subs and likes.
Keep up the good work.

anaibrahim

I love your tutorial. It is very helpful. Thank you.

minrongwang

Good video. I did not get how you got the 0.44 and 0.56. And I suppose that the 0.30 was only an assumption, right? Thanks for the video, and if you could explain that to me it would be awesome.

DanielLopez-mkih

Just a comment: if we calculate using {S(t)} = [Transition Matrix] {S(t-1)} for W2, then we have to transpose the matrix shown at 3:33. Please correct me if I am wrong.

sumitkumarpal
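
This is right as a general point about conventions (the specific frame at 3:33 can't be checked here, so treat the matrix below as an assumed row-stochastic layout): if the update is written as a row vector times the matrix, then the column-vector form S(t) = M S(t-1) needs M to be the transpose of that matrix. Both give the same numbers:

```python
import numpy as np

P = np.array([[0.3, 0.7],         # assumed layout: P[i, j] = P(next = j | current = i)
              [0.5, 0.5]])

w1 = np.array([0.3, 0.7])         # state distribution for week 1

w2_row = w1 @ P                   # row-vector convention: w2 = w1 P
w2_col = P.T @ w1.reshape(-1, 1)  # column-vector convention: s(t) = P^T s(t-1)

print(w2_row)           # [0.44 0.56]
print(w2_col.ravel())   # [0.44 0.56] -- identical, only the convention differs
```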

Thank you, sir! I finally understand the Markov chain concept now!!

thegoodgorilla

Thanks so much as always for the great video 🫰 It also feels very philosophical, convincing someone not to dwell on the past or be anxious about the future: "The future is independent of the past given the present."

xinyuan
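
For reference, the quoted phrase is exactly the Markov property. For a discrete-time chain it reads:

P(X_{t+1} = x | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} = x | X_t)

That is, once the present state X_t is known, the earlier history adds no further information about X_{t+1}.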

Nice job! Clear and simple explanation.

eduardocruces

Have you compared the predictions against past data to find out the accuracy? If so, do you have a video showing it?

kanzzon

Thank you for the clear explanation! I have a video request: Conditional Random Fields.

ledz

Hello, how did you calculate W2? Thank you!

wendycastillo

Great explanation!! It would be great if you could give an overview of DSGE models (often used in econometrics); they also have a steady state.

NicolaevM