Kullback–Leibler divergence (KL divergence) intuitions

Comments

Thank you for making a very intuitive video about the KL divergence 🙏

KianSartipzadeh

Amazing explanation, and the code is such a smart idea.
Thank you for sharing 🙏

BluBr

This is probably the best and simplest explanation. Thanks @CabbageCat for the video 👍

priyankjain

I think the points on the PDF curves are not probability values, since for a continuous random variable the probability at any single point is 0. It is the integral between two points that gives a probability; hence, when you integrate from 0 to infinity, the whole area under the curve is 1 (a probability can never exceed 1).

sunasheerbhattacharjee
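
A minimal numerical sketch of the point in the comment above (not from the video; it assumes a narrow normal distribution purely for illustration): the value of a PDF at a single point is a density, not a probability, so it can even exceed 1, while probabilities only come from integrating the density over an interval.

```python
# Sketch (illustrative assumption: a narrow normal distribution, not the video's PDFs).
# PDF values at single points are densities, not probabilities; probabilities
# come from integrating the density over an interval.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

dist = norm(loc=0.0, scale=0.1)  # narrow normal, so the density peaks above 1

print(dist.pdf(0.0))                        # ~3.99: a density, not a probability
p, _ = quad(dist.pdf, -0.1, 0.1)            # P(-0.1 <= X <= 0.1)
print(p)                                    # ~0.683
total, _ = quad(dist.pdf, -np.inf, np.inf)  # whole area under the PDF
print(total)                                # ~1.0
```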

Perhaps we can find two different distributions with a KL divergence of 0? Why not use P·|log(P/Q)| instead?

loliloloso

What do you mean by the statement that the “positive and negative log ratios will cancel each other out”?

Attempting to verify this, suppose we have X∈{1, 2, 3, 4} and two simple PMFs:
- P(X), with probabilities 0.1, 0.2, 0.3, and 0.4 respectively
- Q(X), with probabilities 0.25, 0.25, 0.25, and 0.25 respectively

But ln(0.1/0.25) + ln(0.2/0.25) + ln(0.3/0.25) + ln(0.4/0.25) = -0.487109, not 0. Perhaps I’m doing something wrong or misinterpreting the video, but I don’t get why this should be true. (A quick numerical check of this is sketched below.)

blackkyurem
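
A quick numerical check of the sums in the comment above (added for illustration, not from the video), assuming the standard discrete KL formula D_KL(P||Q) = Σ P(x) ln(P(x)/Q(x)): the unweighted sum of log ratios is indeed about -0.487, whereas the KL divergence weights each log ratio by P(x), and that weighted sum is always non-negative (Gibbs' inequality).

```python
# Numerical check of the comment above (illustrative, not from the video).
# The unweighted sum of log ratios is not zero; the KL divergence weights
# each log ratio by P(x), and the weighted sum is non-negative.
import numpy as np

p = np.array([0.10, 0.20, 0.30, 0.40])  # P(X) from the comment
q = np.array([0.25, 0.25, 0.25, 0.25])  # Q(X) from the comment

log_ratios = np.log(p / q)

print(log_ratios.sum())        # ~ -0.487: the unweighted sum from the comment
print((p * log_ratios).sum())  # ~  0.106: D_KL(P || Q), non-negative
```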