The Key Equation Behind Probability


My name is Artem. I'm a graduate student at the NYU Center for Neural Science and a researcher at the Flatiron Institute (Center for Computational Neuroscience).

In this video, we explore the fundamental concepts that underlie probability theory and its applications in neuroscience and machine learning. We begin with the intuitive idea of surprise and its relation to probability, using real-world examples to illustrate these concepts.
From there, we move into more advanced topics:
1) Entropy – measuring the average surprise in a probability distribution.
2) Cross-entropy and the loss of information when approximating one distribution with another.
3) Kullback-Leibler (KL) divergence and its role in quantifying the difference between two probability distributions.
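The three quantities listed above can be sketched in a few lines of Python. This is not code from the video; the distributions `p` and `q` below are made-up examples chosen so the numbers come out cleanly:

```python
import math

p = [0.5, 0.25, 0.125, 0.125]  # hypothetical "true" distribution
q = [0.25, 0.25, 0.25, 0.25]   # hypothetical approximating (model) distribution

def entropy(p):
    # Average surprisal: -sum_x p(x) * log2 p(x), measured in bits.
    return -sum(px * math.log2(px) for px in p if px > 0)

def cross_entropy(p, q):
    # Average surprisal when events follow p but surprise is scored
    # under the model q: -sum_x p(x) * log2 q(x).
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

def kl_divergence(p, q):
    # KL(p || q) = cross-entropy(p, q) - entropy(p); always >= 0,
    # and zero exactly when q matches p.
    return cross_entropy(p, q) - entropy(p)

print(entropy(p))            # 1.75 bits
print(cross_entropy(p, q))   # 2.0 bits
print(kl_divergence(p, q))   # 0.25 bits
```

The gap between cross-entropy and entropy is the extra surprise you pay for using the wrong model, which is exactly what KL divergence measures.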

OUTLINE:
00:00 Introduction
02:00 Sponsor: NordVPN
04:07 What is probability (Bayesian vs Frequentist)
06:42 Probability Distributions
10:17 Entropy as average surprisal
13:53 Cross-Entropy and Internal models
19:20 Kullback–Leibler (KL) divergence
20:46 Objective functions and Cross-Entropy minimization
24:22 Conclusion & Outro

CREDITS:
Special thanks to Crimson Ghoul for providing English subtitles!

Comments

So far, one of the clearest videos about entropy and KL divergence...
Great motion design too...

mounirgharsallah

This is Artem Kirsanov's golden year. Posting banger after banger. Much love, your videos are a gem <3

TurinBeats

Wow, what an amazing video and explanation! I love how you derived KL divergence (also known as relative entropy) from cross-entropy and entropy.
It's interesting to note that historically, these ideas were actually discovered in the reverse order. Kullback and Leibler introduced the concept of "information gain" or "relative entropy" (now known as KL divergence) in their 1951 paper "On Information and Sufficiency," building on Shannon's earlier work on entropy. The explicit use of cross-entropy as a separate concept came later, as far as I know.

Your explanation really helps in understanding these interconnected ideas. Thank you for this excellent content!

Variational inference video with this quality would be simply incredible (I really cannot imagine the amount of effort this requires), again thank you for this

gonzalopolo

Personally, after the third time someone predicted the die roll, I would be exponentially more surprised than after the first time

klikkolee

The most succinct explanation of entropy I have heard, the explanation of cross-entropy was very insightful too

MathOnMain

I'd be very keen for a video about variational inference. Have been loving your content

finnrobertson

This video is the new gold standard of stats introductions for dummies; so clear and informative. I'll link it to the next person I find at the beginning of their stats journey! Thanks, bro, for what you do

andreapanuccio

Very impressive how clearly you explained entropy, cross-entropy, and KL divergence with the idea of surprise and great visuals. Well done and thank you for this

drhxa

Cross entropy explains magic shows.
The magician is trying to get the audience to believe the wrong model ("Nothing up my sleeve") and therefore be surprised (and hopefully delighted) by the outcome.

jimcallahan

You called to my soul when you mentioned entropy in the title

Carrymejane

This is the clearest explanation of probability and entropy I've ever seen. Please create more videos.

brucerosner

That's one of the best videos I have ever seen on the probabilistic foundations of ML!

vanhoheneim

Amazing video. It's so rare to find such quality nowadays

davide

Really liked the way you motivated the definition of entropy, thanks a lot

sudiptochatterjee

What an amazing introduction to this point of view of probability!

AlexBerg

This is really good, high-quality content!! You have no idea, this is the first time I've been able to watch a math YouTube video from start to finish in a very long time

FsimulatorX

Thank you for solving my long-lasting questions

bingyanliu

Simple to follow, and crystal clear. Well done!

xyzct

Your animation is so incredible, I even watched the ad.

benfrank

Brilliant, Artem! Your videos are the crown jewels of ML educational content on YouTube!! So intuitive!

andrewgrebenisan