Evidence Lower Bound (ELBO) - CLEARLY EXPLAINED!

This tutorial explains what ELBO is and shows its derivation step by step.
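
For reference, the identity the derivation arrives at, written here in the standard notation with variational distribution q_\phi(z\mid x) (the video's symbols may differ slightly):

\log p_\theta(x)
  = \underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log \frac{p_\theta(x, z)}{q_\phi(z \mid x)}\right]}_{\mathrm{ELBO}(\theta,\,\phi)}
  + D_{\mathrm{KL}}\!\big(q_\phi(z \mid x) \,\|\, p_\theta(z \mid x)\big)

Since the KL term is non-negative, \log p_\theta(x) \ge \mathrm{ELBO}(\theta, \phi), which is why the ELBO is a lower bound on the (log-)evidence.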

#variationalinference
#kldivergence
#bayesianstatistics
Comments

Crystal clear explanation, the world needs more people like you!

AndreiMargeloiu

Again, thank you. This is incredibly well explained; the small steps and the explanations behind them are pure gold.

TheProblembaer

Absolutely beautiful. The explanation is so insanely well thought out and clear.

genericperson

Best explanation ever! I originally found this video while trying to understand VAEs, but I recently realized it is also directly relevant to diffusion models. Thanks for making this video.

sonny

That was great. I'd been going through paper after paper, and all I needed was this! Thanks!

speedbird

Thanks, your tutorial cleared my doubts!!

thatipelli

Best explanation I have found so far, thank you!

bevandenizclgn

Thank you so much for this explanation :) Very clear and well explained. I wish you all the best

FredocasS

Insane explanation, Mr. Sachdeva! Thank you so much - I wish you all the best in life!

T_rex-teus

Fantastic tutorial!! Hoping to see more similar content. Thank you

danmathewsrobin

Excellent presentation and explanation.
Thank you very much, sir.

schrodingerac

This one is a masterpiece. Could you please make a video on Hierarchical Variational Autoencoders when you have time? Looking forward to it.

AruneshKumarSinghPro

Thank you so much, sir! I'm glad that I found your video 💯

vihnupradeep

Very clear explanation! Thank you very much!

brookestephenson

Great explanation. Could you suggest books or articles I may refer to for further and deeper reading on variational inference, Bayesian statistics, and related probability concepts?

ajwadakil

Great explanations! I do have one correction to suggest: at 6:41 you say D_KL is always non-negative, but this can only be true if q is chosen to bound p from above over enough of their overlap (… for the given example, i.e. reverse KL).

HelloWorlds__JTS
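
For readers weighing the comment above: the standard result (Gibbs' inequality) is that D_{\mathrm{KL}}(q \,\|\, p) \ge 0 for any two distributions with q absolutely continuous with respect to p, with no further condition on how q sits relative to p. A one-line sketch via Jensen's inequality (log is concave):

-D_{\mathrm{KL}}(q \,\|\, p)
  = \mathbb{E}_{q}\!\left[\log \frac{p(z)}{q(z)}\right]
  \le \log \mathbb{E}_{q}\!\left[\frac{p(z)}{q(z)}\right]
  = \log \int_{q(z) > 0} p(z)\, dz
  \le \log 1 = 0

so the non-negativity claimed at 6:41 holds for forward and reverse KL alike.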

Thanks for the lecture, sir! I have a question at 4:54: how did you expand that E[log p_theta(x)] into …? Thanks!

easter.bunny.
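
A plausible reading of the step being asked about, assuming the video follows the standard ELBO derivation: \log p_\theta(x) does not depend on z, so it equals its own expectation under any distribution over z:

\log p_\theta(x)
  = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x)\big]
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log \frac{p_\theta(x, z)}{p_\theta(z \mid x)}\right]

where the second equality uses the product rule, p_\theta(x) = p_\theta(x, z) / p_\theta(z \mid x).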

Good explanation, and I can follow the algebra easily. The problem is this: what is known and what is unknown in this formulation? In other words, at 0:26, I think we are trying to find the posterior. But do we know the prior? Do we know the likelihood? Or is it that we do not know them but can sample from them?

sahhaf
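
For context on the question above (the standard variational-inference setup; the video may frame it slightly differently): the prior p(z) and the likelihood p_\theta(x \mid z) are assumed cheap to evaluate, so the joint p_\theta(x, z) = p_\theta(x \mid z)\, p(z) is known. What is intractable is the evidence, and with it the posterior, because of the integral over z:

p_\theta(x) = \int p_\theta(x \mid z)\, p(z)\, dz,
\qquad
p_\theta(z \mid x) = \frac{p_\theta(x \mid z)\, p(z)}{p_\theta(x)}

which is exactly why one introduces a tractable q_\phi(z \mid x) and maximizes the ELBO instead.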

Amazing tutorial! Keep up the good work.

Aruuuq

This is an awesome explanation. Thank you.

chethankr