Variational Inference | Evidence Lower Bound (ELBO) | Intuition & Visualization




Timestamps:
00:00 Introduction
00:54 Problem of intractable posteriors
02:10 Fixing the observables X
02:29 The "inference" in variational inference
03:29 The problem of the marginal
05:06 Remedy: A Surrogate Posterior
06:11 The "variational" in variational inference
06:38 Optimizing the surrogate
08:47 Recap: The KL divergence
09:42 We still don't know the posterior
10:35 Deriving the ELBO
15:17 Discussing the ELBO
17:59 Defining the ELBO explicitly
18:24 When the ELBO equals the evidence
18:56 Equivalent optimization problems
20:38 Rearranging for the ELBO
21:08 Plot: Intro
22:32 Plot: Adjusting the Surrogate
24:02 Summary & Outro
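The derivation outlined in the timestamps (10:35 through 18:24) rests on a single identity. In the video's notation (observables D, latent variables Z, surrogate q), a sketch:

```latex
\log p(D)
  = \underbrace{\mathbb{E}_{q(Z)}\!\left[\log \frac{p(D, Z)}{q(Z)}\right]}_{\text{ELBO}}
  \;+\;
  \underbrace{\mathrm{KL}\!\left(q(Z)\,\|\,p(Z \mid D)\right)}_{\;\geq\, 0}
```

Since the KL term is non-negative, the ELBO lower-bounds the log evidence; and since log p(D) does not depend on q, maximizing the ELBO over q is equivalent to minimizing the KL to the true posterior.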
Comments

This is probably the best explanation I've seen yet on this topic, and I've tried to understand it multiple times. This helped a lot, thank you! :)

gabormolnar

I'm doing my master's thesis on the applications of AI in architecture. Having had no significant mathematical training since high school, I found this video absolutely invaluable for gaining deeper insight into the mechanism behind a VAE. Excellently explained. 10/10

maximilianglaus

I couldn't walk away without paying for this AWESOME lecture. Finally, I'm not afraid of the dang ELBO. Thanks from South Korea! :)

나는강아지-wx

You saved me! I was so frustrated that I could not understand it, but your video is so clear and understandable!

kai-oqlb

Thank you so much! Your explanation is so clear and easy to follow, as opposed to other videos and blogs, which either shy away from the derivation or use phrases like "by simple calculus" to jump straight to the expression.

myfolder

I love you, man. I have literally spent 20+ hours trying to understand this. Most of the explanations I found are so hand-wavy. Thank you so much for spending so much time yourself to understand this and then to make this video.

amansinghal

Thank you so, so much for your videos! I can hardly put into words how much you've helped me understand this topic :D

glatteraal

Errata (thanks to everyone who commented and spotted the errors :) ):


Error at 15:42: p(D) as well as log p(D) corresponds to the evidence. The evidence is just the marginal probability evaluated at the (observed) data. Hence, it is incorrect to say it only becomes the evidence after applying the logarithm. Thanks to @D. Rafaeli for pointing this out (see also his comment).

Error at 19:20: I say that we have found the posterior if the ELBO is equal to zero. This is not correct. We would have found the exact posterior if the ELBO were equal to the (log) evidence, because then the KL divergence between the surrogate and the true posterior is zero. Thanks to @Dave of Winchester for pointing this out. Also see his comment for more details.


Error at 19:50: I wrongly write down the joint p(Z, D), but I mean (and also say) the posterior, i.e. p(Z | D).

Error at 22:28: Correct would be maximizing the ELBO and minimizing the KL, but it should be clear from context.

MachineLearningSimulation
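The erratum at 19:20 is easy to check numerically. A minimal sketch in Python with a made-up two-state model (all numbers are invented for illustration): the ELBO plus the KL between the surrogate and the exact posterior always equals the log evidence, so the surrogate matches the posterior when the ELBO reaches the log evidence, not when it reaches zero.

```python
import math

# Hypothetical toy model: one binary latent z, one fixed observation D.
prior = {0: 0.6, 1: 0.4}   # p(z)
lik   = {0: 0.2, 1: 0.7}   # p(D | z) evaluated at the observed data

# Evidence: p(D) = sum_z p(z) p(D|z)
evidence = sum(prior[z] * lik[z] for z in prior)

# Exact posterior: p(z | D) = p(z) p(D|z) / p(D)
post = {z: prior[z] * lik[z] / evidence for z in prior}

# An arbitrary surrogate q(z) -- deliberately not the posterior
q = {0: 0.5, 1: 0.5}

# ELBO = E_q[log p(D, z) - log q(z)]
elbo = sum(q[z] * (math.log(prior[z] * lik[z]) - math.log(q[z])) for z in q)

# KL(q || p(z|D))
kl = sum(q[z] * (math.log(q[z]) - math.log(post[z])) for z in q)

# The identity from the video: log p(D) = ELBO + KL, for any choice of q
assert abs(math.log(evidence) - (elbo + kl)) < 1e-12
```

Setting q equal to `post` drives the KL to zero and the ELBO up to log p(D), which is exactly the corrected statement in the erratum.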

This is a fantastic video! The intuition was explained beautifully, and I finally understood all the concepts. Thank you so much!

MightyGAN

The nice simulation at the end of the video helps a lot in understanding this concept very intuitively.

relaxingnaturalvibrations

Around the middle of the video, I gave a like. Towards the end of it, I had no choice but to take my like back... because I needed to like the video again!!

melihozcan

It's my third day of trying to understand the basics of EM and the ELBO, and I found this video. Now there won't be a fourth. Thank you!

KomilParmar-gtrr

The video is very nicely organized. I would like to point out to the author and audience that the KL divergence is not a distance, as it is not symmetric; that's why it is called a divergence and not the KL distance.

AkshayRoyal
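The asymmetry noted above is easy to verify numerically. A small sketch with two made-up discrete distributions:

```python
import math

def kl(p, q):
    """KL divergence D_KL(p || q) for discrete distributions on the same support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Two arbitrary distributions over three outcomes (numbers invented for illustration)
p = [0.7, 0.2, 0.1]
q = [0.3, 0.4, 0.3]

# KL(p||q) and KL(q||p) differ, so KL fails the symmetry axiom of a metric
print(kl(p, q), kl(q, p))
```

This asymmetry matters for variational inference: minimizing KL(q || p) (as the ELBO does) penalizes the surrogate for putting mass where the posterior has little, which is different from minimizing KL(p || q).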

Wonderful explanation. Certainly one of the best I’ve seen on this topic!

wedenigt

Thanks a lot! This made me finally understand the ELBO. I really appreciate that you focus on explaining the intuition behind it.

paulstahlhofen

Thank you for this brilliant video; this is the best explanation I have seen so far on this topic.

themeend

Finally, a really good explanation, and I have seen a few. Thanks! And I'm speaking from the perspective of someone who has read Bishop ;)

paaabl.

This is a terrific explanation! Everything I was looking for! Thank you so much.

logannelson

I have been trying to understand this topic because it keeps popping up with variational autoencoders, and this video explains it so well! Thank you!

shiskaraut

This is an excellent lecture on variational inference. Thanks for the effort.

mashfiqulhuqchowdhury