Intro to Markov Chains & Transition Diagrams
Markov Chains, or Markov Processes, are an extremely powerful tool from probability and statistics. They model a statistical process that repeats over and over, where we try to predict the future state of a system. A Markov process is one where the probability of the future depends ONLY on the present state and ignores the past entirely. This might seem like a big restriction, but what we gain is a lot of computational power. We will see how to build a transition diagram describing the probabilities of shifting between different states, and then do an example where we use a tree diagram to compute the probabilities two stages into the future. We finish with an example looking at bull and bear weeks in the stock market.
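The tree-diagram computation described above can be sketched in a few lines of Python. This is a minimal illustration with hypothetical bull/bear transition probabilities (the numbers used in the video may differ): to get a two-step probability, we sum over every intermediate state, multiplying the branch probabilities along each path of the tree.

```python
# Two-state Markov chain for bull/bear weeks in the stock market.
# Transition probabilities here are hypothetical, chosen for illustration:
# e.g. after a bull week, the next week is bull with probability 0.9.
P = {
    "bull": {"bull": 0.9, "bear": 0.1},
    "bear": {"bull": 0.5, "bear": 0.5},
}

def two_step(start, end):
    """Probability of being in `end` two weeks after `start`.

    Tree-diagram computation: sum over each possible intermediate
    state, multiplying the probabilities along each branch.
    """
    return sum(P[start][mid] * P[mid][end] for mid in P)

# P(bull -> bull in two steps) = 0.9*0.9 + 0.1*0.5 = 0.86
print(two_step("bull", "bull"))
```

Note the Markov property at work: the function only ever consults the current state's row of `P`, never the history of how we got there.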
Coming Soon: The follow-up video covers using a Transition Matrix to easily compute probabilities multiple steps into the future.
0:00 Markov Example
2:04 Definition
3:02 Non-Markov Example
4:06 Transition Diagram
5:27 Stock Market Example
COURSE PLAYLISTS:
OTHER PLAYLISTS:
► Learning Math Series
►Cool Math Series:
BECOME A MEMBER:
Special thanks to Imaginary Fan members Frank Dearr & Cameron Lowes for supporting this video.
MATH BOOKS & MERCH I LOVE:
SOCIALS: