10-01. Stochastic processes - Filtrations, martingales and Markov chains.

In this video, we define the general concept of a stochastic process. We also define the concept of a filtration in the context of discrete-time stochastic processes, as well as martingales and Markov chains, and explain their interpretation. These definitions rely on the notions of random variable and conditional expectation defined in previous videos. This is Chapter 4 of my Stochastic Modeling book.
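
For readers skimming the comments below, here is a brief sketch of the standard discrete-time definitions; the video's own conventions, in particular its choice of state space (S, B(S)), may differ slightly.

```latex
% Sketch of the standard discrete-time definitions; the video's notation may differ.
\begin{align*}
  &\text{Stochastic process: a family } X = (X_n)_{n \ge 0} \text{ of measurable maps }
    X_n : (\Omega, \mathcal{F}, P) \to (S, \mathcal{B}(S)). \\
  &\text{Filtration: an increasing sequence }
    \mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \cdots \subseteq \mathcal{F}
    \text{ (the information available up to time } n\text{)}. \\
  &\text{Natural filtration: } \mathcal{F}_n = \sigma(X_0, X_1, \dots, X_n). \\
  &\text{Adapted: } X_n \text{ is } \mathcal{F}_n\text{-measurable for every } n \ge 0.
\end{align*}
```
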
Comments

Excellent lecture: clear, comprehensive, rigorous. Thank you for making and posting this!

rajanalexander

I watched this lecture 2 months ago, before learning about measure theory, because one of my stats courses introduced us to Discrete Stochastic Processes. It was way out of my depth then; now it all makes much more sense!

sinqobilebandile

Has anyone commented on how good the thumbnails look? And on how much time and effort you put into properly categorizing the playlists and counting the total runtime of the videos?

GorgianSoldier

Superb video! As are all your other videos.

Just one remark. Around the 26:00 mark, where you define the concept of 'martingale': I think it should be pointed out that here you no longer assume that X_i : \Omega -> S. Indeed, to make sense of the concept of 'expectation' in general (and hence also of conditional expectation), we have to assume at the very least some kind of linear-algebraic structure (besides a topological one) on the set S. So for instance, simply taking S = IR (the real numbers) would work, but more generally we could (I think) also consider something like [0, +\infty].
Could you specify exactly which assumption(s) you now place on the measurable space (S, IB(S)) (the most general ones, preferably) in order for the definition of 'martingale' to make sense?

Thanks so much!

JoopWilkens
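
For readers with the same question: in the usual textbook formulation (which may not be exactly the one used in the video), the martingale definition is stated for real-valued, integrable, adapted processes, which is what makes the conditional expectations well defined.

```latex
% Usual textbook martingale definition (real-valued and integrable); the video may state it differently.
\begin{align*}
  &(X_n)_{n \ge 0} \text{ is a martingale with respect to } (\mathcal{F}_n)_{n \ge 0}
    \text{ when, for all } n \ge 0, \\
  &\quad X_n \in L^1(\Omega, \mathcal{F}, P), \qquad
     X_n \text{ is } \mathcal{F}_n\text{-measurable}, \qquad
     E[X_{n+1} \mid \mathcal{F}_n] = X_n \ \text{ a.s.}
\end{align*}
```
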

Dear prof. Lanchier, thank you for your excellent lectures.

I have a few questions. 
(1) Around 1:50, on the slide with the definitions, there is a measurable space (S, B(S)) and you refer to B(S) as the "Borel sigma-algebra". However, that does not seem to make sense, since S is not necessarily a topological space. This caused some confusion. Just to be clear: you simply assume (S, B(S)) is a measurable space, right?
(2) In the definition of 'martingale', it is meant that the random variables X_i are real-valued (possibly even extended-real-valued), right? (Because the definition of conditional expectation has to make sense.) Furthermore, I think it must be assumed that X_i is in L^1 for all i? Is that correct?
(3) In the definition of 'Markov chain' it says "for all B \subseteq S", but I assume what is meant is "for all B in B(S)". Am I correct in this?

Thanks a lot and keep up the good work!

HiraBozo
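
Regarding question (3) above, the Markov property is usually quantified over measurable sets B in B(S) rather than arbitrary subsets of S. A sketch of that standard form (the video's exact wording may differ):

```latex
% Standard form of the Markov property, quantified over measurable sets B \in \mathcal{B}(S).
\[
  P\bigl(X_{n+1} \in B \mid \mathcal{F}_n\bigr) = P\bigl(X_{n+1} \in B \mid X_n\bigr)
  \qquad \text{for all } n \ge 0 \text{ and all } B \in \mathcal{B}(S).
\]
```
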

Hi Nicola, between the 32- and 33-minute marks of the lecture you said you were going to give an example of a process that satisfies the definition of a martingale but is not a martingale at all. I checked the next lecture but didn't find any counterexample. Could you tell me whether I missed it, or is it somewhere else? Thanks!

irelandrone

Dear Nicola, can Omega and S be the same? For example, when all the random variables are defined on some complete and separable metric space S and also take values in S?

irelandrone

Does this playlist cover diffusion processes?

adityaprakash

This is really the best thing on stochastic processes that I have seen. However, I have a question regarding the space Omega of realizations and the fixing of a single omega in the definition of the sample path. So omega actually fixes the realization, and the index "i" just corresponds to the instant at which we look up the value on that realization?

MrRk
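
On the sample-path question above, a minimal simulation sketch (not from the lecture; the random walk and the seed are illustrative assumptions): fixing the seed plays the role of fixing a single omega, and the index i just reads off the value of that fixed realization at time i.

```python
# Minimal sketch (not from the lecture): a simple symmetric random walk.
# Fixing the random seed plays the role of fixing a single outcome omega;
# the resulting sequence i -> X_i(omega) is one sample path of the process.
import random

def sample_path(omega_seed, n_steps=10):
    """Return the sample path (X_0(omega), ..., X_n(omega)) for one fixed 'omega'."""
    rng = random.Random(omega_seed)   # fixing omega = fixing the source of randomness
    path = [0]                        # X_0 = 0
    for _ in range(n_steps):
        step = 1 if rng.random() < 0.5 else -1   # +/-1 steps with equal probability
        path.append(path[-1] + step)  # X_{i+1} = X_i + step
    return path

# Two different omegas give two different realizations of the same process;
# reading index i on a fixed path is exactly "looking up the value on that realization".
print(sample_path(omega_seed=1))
print(sample_path(omega_seed=2))
```
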