Intuitively Understanding the KL Divergence

This video discusses the Kullback-Leibler (KL) divergence and explains how it arises as a natural measure of distance between distributions. The video goes through a simple proof which shows how, with some basic maths, we can get under the hood of the KL divergence and intuitively understand what it's about.
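For reference, the quantity the video builds up to is the standard KL divergence between two discrete distributions P and Q (written here in the usual textbook notation, which may differ from the video's):

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

i.e. the expected log-likelihood ratio of the data under P versus under Q.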
Comments

I just want to say: this is, by far, the best explanation of the KL divergence I've found on the internet. Thanks so much!

Vroomerify

The most intelligent people are the ones who are able to explain the hardest concepts in the most intuitive way possible. Thanks.

unsaturated

This was actually one of the most helpful videos. Thank you!

niofer

The KL divergence confused me for so long, and I understood it after watching your video just once. Thank you very much!

liliz

A question here: why will the number of heads and the number of tails be the same for both distributions at 3:04? If the probabilities of the two coins are different, then the numbers of observed heads and tails could also be different.

AashraiRavooru
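One way to see the answer to the question above, assuming the video's coin-flip setup: both coins are scored on the same observed sequence of flips, so the counts N_H and N_T are fixed by the data and only the probabilities assigned to that sequence differ:

P(\text{sequence} \mid \text{coin } 1) = p_1^{N_H} (1 - p_1)^{N_T}, \qquad P(\text{sequence} \mid \text{coin } 2) = p_2^{N_H} (1 - p_2)^{N_T}

The exponents come from the single observed sequence, not from either coin's own probabilities.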

This type of explanation is perfect! First boiling the problem down to the most intuitive understanding, and then deducing the general formula from there. Thanks so much!

karstenhannes

Best explanation of the KL divergence on YouTube, for sure.
Thanks!

RedwanKarimSony_napstar_

You are unbelievably good at teaching, man. You explained it better than they did in my course.

nericarcasci

Holy smoke, you are legit the GOAT. Such a concise yet clear and intuitive explanation.

jimmygan

Excellent video. Can someone help me understand why it is called a divergence in the first place? And why are we raising it to the power 1/N to normalise over the number of samples? I did not understand the logic behind this.

SunilKumarSamji
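A sketch of the usual argument behind the 1/N asked about above (assuming the video follows the standard likelihood-ratio derivation): the probability of a length-N sequence shrinks exponentially with N, so the raw ratio of the two likelihoods depends heavily on how many flips were observed. Raising the ratio to the power 1/N takes its per-sample geometric mean, and the log of that is just the average per-sample log-ratio, which settles down to a fixed number as N grows:

\log \left( \prod_{i=1}^{N} \frac{P(x_i)}{Q(x_i)} \right)^{1/N} = \frac{1}{N} \sum_{i=1}^{N} \log \frac{P(x_i)}{Q(x_i)} \xrightarrow[N \to \infty]{} \sum_{x} P(x) \log \frac{P(x)}{Q(x)} = D_{\mathrm{KL}}(P \,\|\, Q)

(the convergence is by the law of large numbers, with the x_i drawn from P). As for the name, it is called a divergence rather than a distance partly because it is not symmetric, as another comment below points out.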

I'm just rewatching this video to freshen up my deep learning fundamentals. Super clear video, thank you so much!

wynandwinterbach

@3:26 I don't understand how we are normalizing by raising it to the power of 1/N. Could you please explain that?

balasubramanyamevani
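A small numerical check of that 1/N normalization; this is an illustrative sketch with arbitrary made-up parameters (p, q, N), not code from the video. It simulates N flips from a coin P, averages the per-flip log-likelihood ratio against a model Q (equivalent to taking the N-th root of the product of ratios before the log), and compares the result with the closed-form KL divergence.

import numpy as np

rng = np.random.default_rng(0)
p, q = 0.7, 0.5            # heads-probability under the true coin P and the model Q (arbitrary)
N = 100_000                # number of simulated flips (arbitrary)

heads = rng.random(N) < p  # True = heads, sampled from P
# per-flip log-likelihood ratio log(P(x_i) / Q(x_i)) for each flip
log_ratio = np.where(heads, np.log(p / q), np.log((1 - p) / (1 - q)))

# averaging the log-ratio equals the log of the N-th root of the product of ratios,
# and it converges to the KL divergence as N grows
estimate = log_ratio.mean()
exact = p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))
print(estimate, exact)     # the two values should be close for large N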

I didn't expect such a good explanation from a randomly suggested YouTube video.

baskaisimkalmamisti

Great explanation! One technical remark I have is that (from my understanding) the KL divergence is not technically a measure of distance, since it's not symmetric (D_KL(P||Q) != D_KL(Q||P)).

haresage
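A quick numerical illustration of the asymmetry mentioned above, using two arbitrary Bernoulli distributions (0.8 and 0.5 are just example values):

import numpy as np

def kl_bernoulli(p, q):
    # D_KL(P || Q) for Bernoulli distributions with heads-probabilities p and q
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

print(kl_bernoulli(0.8, 0.5))  # D_KL(P || Q), roughly 0.193
print(kl_bernoulli(0.5, 0.8))  # D_KL(Q || P), roughly 0.223; not equal, so KL is not symmetric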

Perfectly explained in 5 minutes. Wow.

adityakulkarni

Bro, this intuition was on another level, you are just a genius!!

sharingpurpose

Thanks so much for this. I needed to understand what the KL divergence is for a paper I'm reading, and you just saved me so much time!

matakos

Great video! Loved the intuition behind the KL divergence. For those thinking about applications: it is used in the loss function of variational autoencoders (VAEs), a class of deep networks where an encoder is used to find low-dimensional features of high-dimensional input data (e.g. to deconstruct images into "features").

marcegger
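For readers curious about the VAE connection mentioned above, a minimal sketch of the closed-form KL term commonly added to that loss, D_KL(N(mu, sigma^2) || N(0, 1)) summed over latent dimensions; the function name, variable names, and example numbers are illustrative, not from the video.

import numpy as np

def gaussian_kl_to_standard_normal(mu, log_var):
    # D_KL( N(mu, exp(log_var)) || N(0, 1) ), summed over the latent dimensions
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# e.g. a 4-dimensional latent code predicted by an encoder network
mu = np.array([0.1, -0.3, 0.0, 0.5])
log_var = np.array([-0.2, 0.1, 0.0, -0.5])
print(gaussian_kl_to_standard_normal(mu, log_var))  # the penalty added to the reconstruction loss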

One of the most useful explanations ever. Thanks!!

germangarcia

Thank you so much for this content. By far the best explanation of the KL divergence I've seen so far.

alkanair