Probabilistic ML - Lecture 20 - Gauss-Markov Models

This is the twentieth lecture in the Probabilistic ML class of Prof. Dr. Philipp Hennig in the Summer Term 2023 at the University of Tübingen.

Contents:
* Time Series
* Markov Chains
* Chapman-Kolmogorov Equation
* Filtering and Smoothing
* The Kalman filter (a minimal predict/update sketch follows below)

© Philipp Hennig / University of Tübingen, 2023 CC BY-NC-SA 4.0
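As a companion to the last bullet point, here is a minimal NumPy sketch of the predict/update recursion for a linear-Gaussian state-space model. This is an illustrative reconstruction, not code from the lecture; the matrices A, Q, H, R and the toy random-walk data are hypothetical.

```python
import numpy as np

def kalman_filter(A, Q, H, R, mu0, Sigma0, ys):
    """Kalman filter for a linear-Gaussian (Gauss-Markov) model.

    Assumed model:
        x_t = A x_{t-1} + q_t,  q_t ~ N(0, Q)   (transition)
        y_t = H x_t     + r_t,  r_t ~ N(0, R)   (observation)
    Returns the filtering means and covariances of p(x_t | y_{1:t}).
    """
    mu, Sigma = mu0, Sigma0
    means, covs = [], []
    for y in ys:
        # Predict (Chapman-Kolmogorov step): p(x_t | y_{1:t-1})
        mu_pred = A @ mu
        Sigma_pred = A @ Sigma @ A.T + Q
        # Update: condition on the new observation y_t
        S = H @ Sigma_pred @ H.T + R             # innovation covariance
        K = Sigma_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        mu = mu_pred + K @ (y - H @ mu_pred)
        Sigma = Sigma_pred - K @ S @ K.T
        means.append(mu)
        covs.append(Sigma)
    return means, covs

# Hypothetical 1D example: a random-walk state observed with noise.
rng = np.random.default_rng(0)
A = np.array([[1.0]]); Q = np.array([[0.1]])
H = np.array([[1.0]]); R = np.array([[0.5]])
xs = np.cumsum(rng.normal(0.0, np.sqrt(0.1), 50))
ys = (xs + rng.normal(0.0, np.sqrt(0.5), 50)).reshape(-1, 1)
means, covs = kalman_filter(A, Q, H, R, np.zeros(1), np.eye(1), ys)
```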
Comments:

matej:
Amazingly well explained. What I often fail to grasp with state-space models and their GP representation is how to feed in covariates or "control signals", i.e., additional inputs that drive the transition function. Do these enter through X or through Y? And is the process then still Markovian? Thanks so much for this video.
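One common convention (a sketch, not necessarily the notation used in this lecture): an exogenous control u_t enters the transition model additively, x_t = A x_{t-1} + B u_t + q_t, so it shifts the prediction step rather than the observations; conditioned on a known control sequence, the chain remains Markovian. The function below and its 1D example are hypothetical.

```python
import numpy as np

def predict_with_control(A, B, Q, mu, Sigma, u):
    """Kalman prediction step with an exogenous control input u_t.

    Assumed transition: x_t = A x_{t-1} + B u_t + q_t, q_t ~ N(0, Q).
    The control shifts the predictive mean only; the covariance is
    unchanged because u_t is treated as known (deterministic), so
    x_t | x_{t-1}, u_t is still Gaussian and the chain stays Markovian
    given the controls.
    """
    mu_pred = A @ mu + B @ u
    Sigma_pred = A @ Sigma @ A.T + Q
    return mu_pred, Sigma_pred

# Hypothetical 1D example: position drifts under a known velocity command.
A = np.array([[1.0]]); B = np.array([[0.5]]); Q = np.array([[0.1]])
mu, Sigma = predict_with_control(A, B, Q, np.zeros(1), np.eye(1), np.array([2.0]))
```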

rolanddeui:
It was mentioned earlier that the product of two GPs is another GP only if both are over the same set of variables (x), and that it is something else if they are over two different sets of variables (say x and y). Does this not apply to the prediction step at 1:17:11 (from the 2nd to the 3rd line)?
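For reference, here is the standard linear-Gaussian prediction step that the timestamped slide presumably instantiates (reconstructed from the general theory, not from the slide itself, assuming a transition x_t = A x_{t-1} + q_t with q_t ~ N(0, Q)). The two Gaussians under the integral are densities in different variables, x_t and x_{t-1}, so their product is a joint Gaussian over the pair rather than a pointwise product of two GPs over the same inputs, and the integral simply marginalizes out x_{t-1}:

```latex
\begin{align}
p(x_t \mid y_{1:t-1})
  &= \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, \mathrm{d}x_{t-1} \\
  &= \int \mathcal{N}(x_t;\, A x_{t-1},\, Q)\,
          \mathcal{N}(x_{t-1};\, \mu_{t-1},\, \Sigma_{t-1})\, \mathrm{d}x_{t-1} \\
  &= \mathcal{N}\!\left(x_t;\; A \mu_{t-1},\; A \Sigma_{t-1} A^\top + Q\right).
\end{align}
```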