Deep Learning Lecture 3: Maximum likelihood and information

Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Comments

Fantastic videos, I appreciate how open you're being with the coursework!

MadcowDeity

Out of the entire course this is easily the most important lecture. I've watched it several times to really internalize it.

mpete

Your energy comes through in this whole playlist!

osamamustafa

Thank you very much for posting these lectures! So far, I find them interesting and understandable.

paulthomann

Very clear connection between least-squares loss, MLE, and KL divergence. Thanks.

CW
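For readers following up on this comment, the connection can be sketched in one line. This is the standard derivation, not transcribed from the video; the symbols x_i, y_i, θ, σ follow the usual linear-regression notation:

```latex
% Model: y_i = x_i^\top \theta + \epsilon_i, \quad \epsilon_i \sim \mathcal{N}(0, \sigma^2)
\log p(y \mid X, \theta, \sigma)
  = -\frac{n}{2}\log\!\left(2\pi\sigma^2\right)
    - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - x_i^\top \theta\bigr)^2
```

Maximizing the log-likelihood over θ therefore minimizes the least-squares loss, and minimizing the KL divergence from the empirical distribution to the model is, up to a constant that does not depend on θ, the same as maximizing the average log-likelihood.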

This lecturer teaches more in one hour than my own lecturer does. So good. Please teach at my uni!

dbskluvu

59:55 Why does the prediction for new data x_* compute P(y | x_*, D, σ) instead of (x_*)^T Θ?

aa-vbvh

Better explanation than Bishop's chapter 1 : )

dragonlorder

Good content for beginners! I strongly advise against your method for simulating Gaussian variables, though. It is a general method (using the inverse of the distribution function) that is really inefficient for Gaussian variables. The two best ways to go are the Box–Muller method and Marsaglia's polar method (and in my experience their performance is equivalent). That being said, I enjoyed this video; thank you for sharing!

treflir
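For reference alongside the comment above, here is a minimal sketch of the Box–Muller transform it recommends. This is an illustrative implementation using only the standard library, not code from the lecture:

```python
import math
import random

def box_muller(rng=random.random):
    """Return two independent standard normal draws via the Box-Muller transform."""
    # 1 - rng() lies in (0, 1], so the logarithm below is always finite.
    u1 = 1.0 - rng()
    u2 = rng()
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)
```

Each call maps two uniform draws to two independent N(0, 1) samples, avoiding the inverse-CDF evaluation the comment objects to.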

At the very end, how closely related is cross entropy to that KL/MLE relationship? It seems that you have it as the variable term in 1:12:08 but I'm not sure.

VictorChavesVVBC
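On the question above: cross-entropy, entropy, and KL divergence are linked by a standard information-theoretic identity (stated here from first principles, not transcribed from the timestamped slide), where p is the data distribution and q_θ the model:

```latex
H(p, q_\theta) = \mathbb{E}_{p}\bigl[-\log q_\theta\bigr]
              = H(p) + \mathrm{KL}\bigl(p \,\|\, q_\theta\bigr)
```

Since H(p) does not depend on θ, minimizing the cross-entropy in θ is equivalent to minimizing KL(p ‖ q_θ), which is what maximum likelihood does in expectation; the θ-dependent term in the decomposition is exactly the cross-entropy.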

At the end of each video, it would be great to include a "Summary for Practitioners" that distills the theory into practice: the why and the what.

npabbisetty

I like the video series, but the audio needs work. The constant background buzz is really distracting.

cognitiveinstinct

The lecturer is careless with the mathematics: he drops logarithms and confuses probabilities with values of density functions. But thanks anyway.

maiiabakhova