Coding Gaussian process regressors FROM SCRATCH in Python

In this video we will implement a Gaussian process regressor with a squared exponential kernel in Python using numpy only, and code several interactive plots to visualize it. Feel free to adapt the code and get hands-on with GPRs yourself!
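
For reference, here is a minimal sketch of a squared exponential kernel in numpy, in the spirit of what the video builds; the function and parameter names are illustrative assumptions and may differ from the code shown in the video:

import numpy as np

def sq_exp_kernel(x1, x2, length_scale=1.0, signal_var=1.0):
    # Squared exponential kernel: k(a, b) = signal_var * exp(-||a - b||^2 / (2 * length_scale^2))
    # x1: (n, d) array, x2: (m, d) array; returns the (n, m) covariance matrix.
    # Pairwise squared Euclidean distances via broadcasting
    sq_dists = np.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1)
    return signal_var * np.exp(-0.5 * sq_dists / length_scale**2)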

0:00 Intro
0:43 Preliminaries
1:35 Implement squared exponential kernel
3:48 Implement GPR
12:48 Plot GPR
15:05 Draw random functions from GPR
17:08 Add points iteratively
18:07 Change parameters
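
As a companion to the chapters above, here is a minimal sketch of the GPR posterior and of drawing random functions from it, using the standard equations and the sq_exp_kernel sketched earlier; the variable names, noise term, and toy data are assumptions, not the video's exact code:

import numpy as np

def gpr_posterior(x_train, y_train, x_test, kernel, noise_var=1e-8):
    # Posterior of a GP regressor at the test inputs:
    #   mean = K_s^T (K + noise_var*I)^{-1} y
    #   cov  = K_ss - K_s^T (K + noise_var*I)^{-1} K_s
    K = kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = kernel(x_train, x_test)    # (n_train, n_test)
    K_ss = kernel(x_test, x_test)    # (n_test, n_test)

    # Cholesky factorization instead of a direct inverse, for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, K_s)

    mean = K_s.T @ alpha
    cov = K_ss - v.T @ v
    return mean, cov

# Toy example: condition on three points and draw five random functions from the posterior
rng = np.random.default_rng(0)
x_train = np.array([[-2.0], [0.0], [1.5]])
y_train = np.sin(x_train).ravel()
x_test = np.linspace(-4.0, 4.0, 100).reshape(-1, 1)

mean, cov = gpr_posterior(x_train, y_train, x_test, sq_exp_kernel)
samples = rng.multivariate_normal(mean, cov + 1e-10 * np.eye(len(x_test)), size=5)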

-----------------------------------------------

Comments

Out of what feels like two dozen tutorials and explanations I found, this is actually the one that made me understand it.

reallyanotheruser

This is solid gold for me. I like learning in a visual way that I can interact with. Thanks for your effort.

youngzproduction

This is amazing! Thank you for providing this approach; it really helped me understand GPR a lot better.

OliverJanShD

Extremely helpful for understanding GPRs, thank you!

rossci

Very cool and easily digestible content, loved it!

umutkorkut

This is so underrated. Good job anyway!

satadrudas

Excellent material you provided here; I just came back to the video to congratulate you on the content, hahaha. Thank you, man!

gbm

You are amazing! Thanks for helping me study for my Green Light meeting, which is due in less than two days! This video gave me great confidence. Once again, thank you very much!

muhammadrayyan

Excellent, the best video on Gaussian process regressors.

amothe

Underrated video! Thanks for making this great content. This helped me quite a bit as I prepared a lecture on this topic for my materials science students.

TaylorSparks

This was really helpful for me in understanding GPs, thank you so much for your efforts.

eva__

Very nice video - thank you very much :D

icoop

Absolutely Mindblowing Work! Keep it up. May Allah bless you. 🙂

komuna

Great video. Would you do a follow-up on hyperparameter optimization using the marginal log-likelihood as the loss function?

Also, a visualization example using multi-input GPs would be interesting as well. Or multi-output GPs.

swisscheese
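
For readers wondering what the marginal log-likelihood mentioned in the comment above looks like, here is a minimal, generic sketch (not code from the video) of the negative log marginal likelihood that is typically minimized over kernel hyperparameters such as the length scale and signal variance:

import numpy as np

def neg_log_marginal_likelihood(x_train, y_train, kernel, noise_var=1e-6):
    # -log p(y | X) = 0.5 * y^T K^{-1} y + 0.5 * log|K| + 0.5 * n * log(2*pi),
    # with K = kernel(X, X) + noise_var * I
    n = len(x_train)
    K = kernel(x_train, x_train) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    # log|K| = 2 * sum(log(diag(L))) when K = L L^T
    return 0.5 * y_train @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * n * np.log(2 * np.pi)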

Well done!
Writing comments in the code would be helpful for beginners, and if it were put in the context of solving a problem or with examples, it would be even more useful.

Thanks!

azd.zayoud

Can we have access to the notebook file?

pouyaaghaeipour

Amazing 👌🙏👌
Access to the notebook would be great 🙏🙏🙏

jaimesastre

Is the follow-up still coming? Everything has been described very well so far...

MrSchwede

Is sigma 0 or 1 in this example?
The title of the graph says it is 0, but doesn't the code say it equals 1?

IvanStar

Hello,
how can I feed sequences of input data to train on sequences of outputs?

yeshuip