Entropy | Cross Entropy | KL Divergence | Quick Explained

After a long time, here's finally a topic that underlies a lot of things, especially when it comes to generative modeling. In this video, we'll look at Entropy, Cross-Entropy, and, most importantly, KL Divergence, which is used frequently in Generative Adversarial Networks.
Let's see what it is, how it originated, and what it does.
I hope you'll like it. If not, please leave your feedback in the comments.
And as always,
Thanks for watching ❤️

Timestamps:
0:00 Entropy
2:13 Cross Entropy
3:23 KL Divergence
4:35 KL Divergence in code
5:25 Things to remember
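
For reference, here is a minimal sketch of the three quantities the video walks through, assuming discrete distributions given as probability vectors. The variable names and example values are illustrative, not the exact numbers used in the video:

import numpy as np

# Two illustrative discrete distributions over the same four outcomes
# (each must sum to 1).
p = np.array([0.60, 0.20, 0.15, 0.05])   # "true" distribution
q = np.array([0.25, 0.25, 0.25, 0.25])   # approximating distribution

# Entropy of p:        H(p)    = -sum_i p_i * ln(p_i)
entropy_p = -np.sum(p * np.log(p))

# Cross-entropy:       H(p, q) = -sum_i p_i * ln(q_i)
cross_entropy_pq = -np.sum(p * np.log(q))

# KL divergence:       D_KL(p || q) = sum_i p_i * ln(p_i / q_i)
kl_pq = np.sum(p * np.log(p / q))

print(entropy_p, cross_entropy_pq, kl_pq)
print(np.isclose(kl_pq, cross_entropy_pq - entropy_p))  # True

The last line checks the identity D_KL(p || q) = H(p, q) - H(p), which is why minimizing cross-entropy with respect to q also minimizes the KL divergence.
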
Comments

This video is just a treasure. The simplest explanation. Thanks!!

pashamorozov

Nice explanation
Thank you for this clear presentation

durandekamga

I'm really thankful for your video; it's so simple and makes me understand KL divergence much more clearly <3 I wish I had found your video earlier

ucembaikbangerparah

Finally a very concise explanation on YouTube. Thank you very much.

utkuaslan

Really insightful explanation that clears up the things that puzzled me the most!

johnny

Thanks. Really well put together. Excellent explanation.

lincolnsimms

What happens if, instead of one-hot encoded hard labels, we have a soft-label distribution?

PoojaKumawat-zi
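
Regarding the soft-label question above: the same cross-entropy formula H(p, q) = -sum_i p_i * ln(q_i) still applies. With a one-hot target only the true class contributes, while with soft labels every class with nonzero target probability contributes. A minimal sketch with made-up numbers (not from the video):

import numpy as np

q = np.array([0.7, 0.2, 0.1])        # model's predicted distribution

hard = np.array([1.0, 0.0, 0.0])     # one-hot target: only -ln(q[0]) survives
soft = np.array([0.8, 0.15, 0.05])   # soft-label target distribution

ce_hard = -np.sum(hard * np.log(q))  # equals -ln(0.7)
ce_soft = -np.sum(soft * np.log(q))  # weighted sum of -ln(q_i) over all classes
print(ce_hard, ce_soft)
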

Thank you so much for the amazing explanations!
What software do you use to create animations in your videos?

michoello

Your video is great, but I see a mistake in the cross-entropy calculation: ln(0.2) = -1.6, not 1.6, and likewise for ln(0.15) and ln(0.05). I think you just forgot the minus sign. 😅

dareeul
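
To double-check the point in the comment above: the natural logs of probabilities below 1 are indeed negative, and the leading minus sign in H(p, q) = -sum_i p_i * ln(q_i) is what makes each contribution positive. A quick verification of just the log values mentioned in the comment:

import math

print(math.log(0.2))   # -1.609...
print(math.log(0.15))  # -1.897...
print(math.log(0.05))  # -2.996...
# The leading minus sign in -sum_i p_i * ln(q_i) flips these negative
# values, so each term adds a positive amount to the cross-entropy.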