The KL Divergence: Data Science Basics

Understanding how to measure the difference between two distributions.

0:00 How to Learn Math
1:57 Motivation for P(x) / Q(x)
7:21 Motivation for Log
11:43 Motivation for Leading P(x)
15:59 Application to Data Science
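
To make the formula the chapters build up concrete, here is a minimal sketch of the discrete KL divergence, D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)); the function name and the coin distributions below are illustrative, not taken from the video.

import math

def kl_divergence(p, q):
    # D_KL(P || Q) for two discrete distributions given as lists of
    # probabilities over the same outcomes; terms with P(x) = 0 contribute 0.
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

# Example: a fair coin P versus a biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.511 nats: P and Q differ
print(kl_divergence(p, p))  # 0.0: identical distributions

The leading P(x) weights each log-ratio by how often x actually occurs under P, which is what the 11:43 chapter motivates.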
Comments

Wow... 😳 I've never seen a more ingenious, easy, and intuitive explanation of KL-div 😳👏👏👏👏👏 Big thanks, good man! ❤️

szymonk.

I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.

zafersahinoglu

That was the best description of why we use log that I have ever seen. Good work, man.

DS-vuyo

Your bottom-up (instead of top-down) approach that you mentioned at the beginning of the video would be really great to see for all kinds of different concepts!

murkyPurple

I don't think I'm ever going to forget this. Thanks so much.

varadpuntambekar

You are probably the best teacher I've ever seen, and I've learned from tons of people online like Andrew Ng, Andrej Karpathy, the MIT lecture series, Brad Traversy, and StatQuest.

usethisforproductivity-tgxq

Wow. This is the best explanation of KL-divergence I've ever heard. So much over-complicated stuff out there, but yours is absolutely genius.

marka

This is the most intuitive explanation for any statistics problem.

steamedbean

I'm in the middle of a $2,500 course, BUT → YouTube → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.

KippSchoenwald-mu

This is mind-blowing... I love the way you go from the problem to the solution; it's a clever way to understand KL divergence.

trungphan

I am a postdoc studying information theory and language. This is the best KL divergence explanation I've heard. I don't think I am going to forget it. :) Thanks!

elevenyhz

That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.

JBoya

Great video! One small thing I noticed is that usually P is the reference or true distribution and Q is the proposed distribution. It's the opposite of what you mentioned at 5:43 (see the formula written out below).

michaelzyang
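
For reference, the convention that comment points to puts the true (reference) distribution P first and the proposed approximation Q second:

$$ D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_x P(x)\,\log\frac{P(x)}{Q(x)} $$

Because the order matters, D_KL(P || Q) generally differs from D_KL(Q || P); for the fair-coin/biased-coin example above, the two directions come out to about 0.511 and 0.368 nats.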

Clear and simple, the best approach to leading up to why we use the formula. Thank you!!

Oliprod

The comments didn't lie; you actually explained this so well. I watched the ads all the way through, btw.

SSJVNN

I recently got interested in learning machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image generation AI.
That's where I encountered the KL divergence. The more I tried to understand it, the more complicated concepts and formulas were thrown at me.
I managed to find some videos that explain how to derive it, but none of them explained why the logarithm is there in the first place, for god's sake!
And here you are, explaining every detail missing from other videos and blog posts in a way that someone who knows very little about the subject can understand, in a satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.

brandonkim

Just fantastic! Even if I forget the formula for KL divergence, I can "re-engineer" it on demand.

AdeOlubummo

Excellent way to explain the concept of KLD. I landed on this video after checking 4-5 other tutorials, but none of them match the ease of this one. Thanks.

nileshchandrapikle

Best math teacher ever. You explained the design and thought process behind how the formula comes about so clearly. Many videos just explain the formula, which left me confused about why we do it this way... Thank you!

eagermage

Man, you are amazing.
I am going to binge-watch all the videos for better intuitive understanding.

raafeyazher