Variational Inference: Foundations and Innovations

David Blei, Columbia University
Computational Challenges in Machine Learning
Comments

Wonderful Talk~~~~
7:45 Start
10:49 GMM model example
13:37 LDA example
22:42 Conditionally conjugate models
28:22 ELBO
30:52 Mean-field VI
37:27 Stochastic VI
48:07 Black box VI
1:00:47 Reparameterization and amortization

whaleshark

“Great question! I wish this talk was over so I could go and think about it”

pauloabelha

Check Blei's latest talk on this topic:

And the 2016 NIPS tutorial talk:

bnglr

Amazing! A brief presentation, but it gives deep insights.

jiachenlei

I understand we measure the distance between two distributions using KL divergence, but am still very confused. How do we know whether we are getting closer to the actual posterior distribution if we do not know the posterior distribution?
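
One way to see why this works (an editorial sketch in standard VI notation, not part of the original comment): the KL divergence to the posterior differs from the negative ELBO only by the model evidence, which is constant in q:

\mathrm{KL}\big(q(z)\,\|\,p(z\mid x)\big)
  = \mathbb{E}_{q}[\log q(z)] - \mathbb{E}_{q}[\log p(z, x)] + \log p(x)
  = -\mathrm{ELBO}(q) + \log p(x)

Since \log p(x) does not depend on q, maximizing the ELBO is equivalent to minimizing the KL to the posterior, and the ELBO involves only the joint p(z, x), which we can evaluate without ever knowing the posterior itself.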

monart

28:08 On the "bad properties of KL divergence" and alternative divergence measures: does anyone have any pointers to further reading? Very interesting.
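
A brief editorial note (standard facts about VI, not taken from this thread): the divergence minimized in VI is the reverse KL,

\mathrm{KL}\big(q(z)\,\|\,p(z\mid x)\big) = \int q(z)\,\log\frac{q(z)}{p(z\mid x)}\,dz,

which is zero-forcing: wherever the posterior puts negligible mass, the optimal q must also put negligible mass, so mean-field solutions tend to lock onto a single mode and underestimate posterior variance. Alternatives discussed in the literature include the forward KL(p || q) used in expectation propagation and the broader family of alpha-divergences.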

ewfq

Can we have access to the slides, please?

martindelgado