Demystifying Variational Inference (Sayam Kumar)

Speaker: Sayam Kumar

Title: Demystifying Variational Inference

Event description:
What do you do when MCMC takes too long to sample, or when the dataset is huge? Is there a more cost-effective method for approximating the posterior that can save us and potentially produce similar results? Well, you have come to the right place. In this talk, I will explain the intuition and maths behind Variational Inference, the algorithms that capture different amounts of correlation, the out-of-the-box implementations we can use, and finally how to diagnose whether the fitted model suits our use case.
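As a hedged illustration of the central quantity such methods optimize, here is a Monte Carlo estimate of the ELBO on a toy conjugate model of my own choosing (not from the talk), where the exact log evidence is known in closed form. The ELBO always lower-bounds the log evidence, and the gap is exactly KL(q || posterior):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (illustrative choice, not from the talk):
#   z ~ N(0, 1),  x | z ~ N(z, 1),  observed x = 1.0
# Marginally x ~ N(0, 2), so the exact log evidence is known,
# and the exact posterior is N(0.5, 0.5).
x = 1.0
log_evidence = -0.5 * np.log(2 * np.pi * 2.0) - x**2 / (2 * 2.0)

def log_norm(v, mean, var):
    """Log density of N(mean, var) at v."""
    return -0.5 * np.log(2 * np.pi * var) - (v - mean) ** 2 / (2 * var)

# A deliberately imperfect Gaussian variational approximation q(z)
q_mu, q_var = 0.3, 0.6
z = q_mu + np.sqrt(q_var) * rng.standard_normal(100_000)

# ELBO = E_q[log p(x, z) - log q(z)], estimated by Monte Carlo
elbo = np.mean(
    log_norm(z, 0.0, 1.0)      # log prior
    + log_norm(x, z, 1.0)      # log likelihood
    - log_norm(z, q_mu, q_var) # minus log q
)

# elbo sits slightly below log_evidence; the gap is KL(q || posterior)
print(elbo, log_evidence)
```

Maximizing the ELBO over the variational parameters (`q_mu`, `q_var` here) is therefore equivalent to minimizing the KL divergence from q to the true posterior.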

Discourse Discussion

## Timestamps
- 0:00 Start of event
- x:xx
- x:xx

Note: help us add timestamps here.

Speaker bio:
Sayam Kumar is a Computer Science undergraduate student at IIIT Sri City, India. He loves to travel and study maths in his free time, and he finds Bayesian statistics super awesome. He was a Google Summer of Code student with the NumFOCUS community and contributed to adding Variational Inference methods to PyMC.

Speaker info:

Part of PyMCon2020.

#bayesian #statistics
## Comments

Good video. Just a quick question: when we take gradients of the ELBO with respect to the variational parameters, can we actually write the MC approximation given at 10:22? The gradient of an expectation is not equal to the expectation of the gradient when we're sampling from the same distribution with respect to which we're taking gradients. In ADVI, we end up reparameterizing to a standard normal distribution. Any explanation would be helpful.

sobanlone
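The reparameterization the comment refers to can be sketched in a few lines (toy objective f(z) = z², my own choice for illustration): writing z = μ + σ·ε with ε ~ N(0, 1) moves the dependence on μ out of the sampling distribution, so the gradient can legitimately pass inside the expectation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective inside the expectation: f(z) = z**2, with
# q_theta = N(mu, sigma**2). Analytically E_q[z^2] = mu^2 + sigma^2,
# so d/dmu E_q[f(z)] = 2 * mu, which we can check against.
mu, sigma = 1.5, 0.8
f_grad = lambda z: 2.0 * z  # f'(z) for f(z) = z**2

# Reparameterize: z = mu + sigma * eps, eps ~ N(0, 1).
# eps is sampled from a fixed distribution that does not depend on mu,
# so d/dmu E[f(z)] = E[f'(mu + sigma * eps)] holds, and a plain
# Monte Carlo average of the gradient is unbiased.
eps = rng.standard_normal(100_000)
grad_estimate = np.mean(f_grad(mu + sigma * eps))

print(grad_estimate)  # close to 2 * mu = 3.0
```

Without the reparameterization (sampling z directly from q and averaging f'(z)) the estimator would indeed be biased, which is exactly the concern the comment raises; the score-function (REINFORCE) estimator is the usual alternative when a reparameterization is unavailable.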