Introduction to Gaussian processes


Comments

I appreciate how thorough this video was. Many tutorials on GPs tend to handwave a lot of the mathematical details of Gaussians and use sloppy notation (which is a huge problem with machine learning education in general, in my opinion).

jonathancangelosi

I really appreciate this. I had been studying GPs with a lot of confusion, and this was illuminating for me.

franard

I loved learning that a diagonal noise term can help with ill-conditioned matrix inversions.

betube
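The diagonal noise term mentioned above is often called "jitter" or a "nugget". A minimal NumPy sketch (not from the video; the jitter value 1e-6 is an illustrative choice) of why it helps: near-duplicate inputs make the kernel matrix numerically singular, and a small diagonal addition restores positive definiteness so the Cholesky factorization succeeds.

```python
import numpy as np

def stable_cholesky(K, jitter=1e-6):
    # Add a small diagonal term so the kernel matrix stays
    # positive definite despite floating-point round-off.
    return np.linalg.cholesky(K + jitter * np.eye(K.shape[0]))

# Nearly-duplicated inputs make an RBF kernel matrix ill-conditioned:
# exp(-0.5 * (1e-9)**2) rounds to exactly 1.0 in float64, so two rows
# of K become identical and plain Cholesky would fail.
x = np.array([0.0, 1e-9, 1.0])
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
L = stable_cholesky(K)  # succeeds thanks to the jitter
```

In practice, if the model already includes observation noise sigma^2 on the diagonal, that term plays the same stabilizing role.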

Thank you for these classes, very helpful - and thanks, probably, to COVID for getting them onto YouTube :-)

hossanatwino

I don't see the slides on the website, though the speaker says they are there.

microndiamondjenkins

Thank you so much for this wonderful video. In most of your figures, you have about 10 different colors moving along the x-axis. On each slice (a vertical line at x = x_j, let's say j = 5) we have 10 points in 10 different colors that are normally distributed, while being correlated, via a kernel, with the 10 points at the other inputs x_i (i = 0, 1, ..., 5, ..., n), and let's say n = 20. My question is: how do you generate these points? In total we have 10 (colors) × 20 (n) = 200 points, which have to satisfy two conditions: 1) being normally distributed on each slice, and 2) following the correlation given by the kernel. Thank you.

mohsenvazirizade
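One standard way to generate points like those in the question above (not necessarily how the video's figures were made) is to draw each colored curve as a single joint sample from a multivariate normal N(0, K), where K is the n-by-n kernel matrix over all inputs. A sketch assuming NumPy and a squared-exponential kernel; the lengthscale and jitter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x1, x2, lengthscale=1.0):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / lengthscale) ** 2)

n, n_curves = 20, 10                       # 20 inputs, 10 colored curves
x = np.linspace(0.0, 5.0, n)
K = rbf_kernel(x, x) + 1e-9 * np.eye(n)    # tiny jitter for stability
L = np.linalg.cholesky(K)

# Each column of z is i.i.d. standard normal, so L @ z has covariance
# L @ L.T = K. Every column is one joint draw from N(0, K): both
# conditions hold by construction -- points on each vertical slice are
# normally distributed, and points along x follow the kernel correlation.
samples = L @ rng.standard_normal((n, n_curves))   # shape (20, 10)
```

So the 200 points are not sampled independently and then correlated afterwards; each curve of 20 points is one draw from the joint Gaussian, which enforces both conditions at once.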

Can anyone say something about the kernel k(x, x')? What do x and x' mean here? I thought they were two inputs to a random function that produces a value, but I also saw a vector form like k(x, x') = x^T x'. Does it mean that x is the observed point and x' is the point for prediction?

muluegebreslasie
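On the question above: x and x' are simply any two inputs, and k(x, x') gives the covariance between the function values f(x) and f(x'); the kernel does not distinguish observed points from prediction points, it is just evaluated between whichever pairs are needed. The form k(x, x') = x^T x' is the linear kernel, which arises from Bayesian linear regression with a standard normal prior on the weights. A small illustrative sketch (function names are my own, not from the video):

```python
import numpy as np

def linear_kernel(x, xp):
    # k(x, x') = x^T x': the covariance of f(x) and f(x') under a
    # linear model f(x) = w^T x with prior w ~ N(0, I).
    return x @ xp

def rbf_kernel(x, xp, lengthscale=1.0):
    # Squared-exponential kernel for comparison.
    return np.exp(-0.5 * np.sum((x - xp) ** 2) / lengthscale**2)

# x and xp are just two inputs -- either can be a training point or a
# test point; the same kernel fills the train-train, train-test, and
# test-test blocks of the joint covariance matrix.
x  = np.array([1.0, 2.0])
xp = np.array([0.5, -1.0])
print(linear_kernel(x, xp))   # 1.0*0.5 + 2.0*(-1.0) = -1.5
print(rbf_kernel(x, xp))
```

In the GP prediction formulas, k is evaluated between training inputs (giving K), between training and test inputs (giving k_*), and between test inputs, but it is the same function in every case.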