(Doubly) Semi-Implicit Variational Inference


This is a follow-up to the first talk of this year; this time I will tell you more about variational inference with implicit distributions. We will assume that the approximate posterior and the prior can both be expressed as an intractable infinite mixture of some analytic density with a highly flexible implicit mixing distribution. It turns out that this formulation allows one to perform both variational inference and variational learning, and yields a sandwich bound on the ELBO that is asymptotically exact. At the end of the talk, I will tell you a bit about the use cases for (doubly) semi-implicit variational inference and learning and our experimental results.
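To give a flavor of the sandwich bound mentioned above, here is a minimal sketch of the two multi-sample surrogates for the intractable log-density of a semi-implicit distribution. The concrete choices below (a Gaussian conditional, a tanh-of-noise mixing distribution, the values of `sigma` and `K`) are illustrative assumptions for this sketch, not the talk's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1      # std of the analytic conditional q(z | psi)
K = 10           # number of fresh mixing samples per bound
n_outer = 2000   # Monte Carlo repetitions

def sample_psi(n):
    # Implicit mixing distribution: a nonlinear push-forward of Gaussian
    # noise; we only ever sample from it, never evaluate its density.
    return np.tanh(rng.normal(size=n))

def log_normal_pdf(z, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (z - mu) ** 2 / (2 * sigma**2)

uppers, lowers = [], []
for _ in range(n_outer):
    psi0 = sample_psi(1)           # mixing sample that generates z
    z = rng.normal(psi0, sigma)    # z ~ q(z | psi0)
    psi = sample_psi(K)            # fresh mixing samples
    dens_fresh = np.exp(log_normal_pdf(z, psi, sigma))
    # Lower surrogate of log q(z): average over fresh samples only;
    # by Jensen's inequality its expectation lower-bounds log q(z).
    lowers.append(np.log(np.mean(dens_fresh)))
    # Upper surrogate: additionally include psi0, the sample that
    # generated z; its expectation upper-bounds log q(z).
    dens_all = np.append(dens_fresh, np.exp(log_normal_pdf(z, psi0, sigma)))
    uppers.append(np.log(np.mean(dens_all)))

print(np.mean(lowers), np.mean(uppers))  # lower mean below upper mean
```

Since the ELBO contains the term -E_q[log q(z)], plugging the upper surrogate in for log q(z) gives a lower bound on the ELBO, and the lower surrogate gives an upper bound; both tighten as K grows, which is the asymptotic exactness mentioned above.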