Probability - Math for Machine Learning

In this video, W&B's Deep Learning Educator Charles Frye covers the core ideas from probability that you need in order to do machine learning.

In particular, we'll see why mathematically rigorous probability theory is so challenging, and then go over why negative logarithms of probabilities, aka "surprises", show up so often in machine learning.
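Since surprises are the through-line of the video, here is a minimal sketch in Python (my illustration, not code from the video): the surprise of an outcome is the negative log of its probability, and the familiar negative log-likelihood loss is just a model's average surprise at the observed data.

```python
import math

# A minimal sketch, not code from the video: the "surprise" of an outcome
# with probability p is its negative log probability, -log(p).
def surprise(p: float, base: float = 2.0) -> float:
    """Surprise of an outcome with probability p, in bits when base=2."""
    return -math.log(p, base)

print(surprise(0.5))   # 1.0 bit, like calling one fair coin flip
print(surprise(0.25))  # 2.0 bits, like calling two fair coin flips
print(surprise(1.0))   # -0.0, i.e. zero bits: a certain outcome is not surprising

# The usual negative log-likelihood (cross-entropy) loss is the model's
# average surprise at the observed labels.
def average_surprise(probs_of_true_labels: list[float]) -> float:
    """Mean surprise over the probabilities a model assigned the true labels."""
    return sum(surprise(p) for p in probs_of_true_labels) / len(probs_of_true_labels)

print(average_surprise([0.9, 0.8, 0.5]))  # smaller when the model is less surprised
```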

0:00 Introduction
1:45 Probability is subtle
7:45 Overview of takeaways
8:47 Probability is like mass
17:51 Surprises show up more often in ML
21:46 Surprises give rise to loss functions
24:31 Surprises are better than densities
33:35 Gaussians unite probability and linear algebra
39:57 Summary of the Math4ML ideas
41:40 Additional resources on Math4ML
Comments

Dr Frye on fire as usual. 🔥 Really useful series, thanks

jeffnc

Lots of great stuff, I'm still digesting the articles linked from the calculus video; and now stats is coming out!

tam

You completely "surprised" me :). Some of the interpretations were completely new to me. Where did you learn these?

shubhamtalks

So if the entropy formula is like that, then why do we say surprises are measured in bits? Does "bit" here mean 0/1?

macknightxu

Why do all the assumptions use i.i.d. data, when data in real life often isn't i.i.d.?

mohamedtarek

Amazing team, Weights and Biases ❤💥🔥
Please consider making one on Information Theory 🤓

mdrasel-ghyf

Phenomenal video, Weights & Biases. I smashed the thumbs up on your video. Keep up the exceptional work.

KeyserTheRedBeard

2 plus 2 equals 5 is possible when calculating incorrectly😄

macknightxu

What do you mean by "surprise"? Does it mean a strange thing?

macknightxu