Foundations and Challenges of Deep Learning (Yoshua Bengio)


Having read, watched, and presented deep learning material over the past few years, I have to say that this is one of the best collections of introductory deep learning talks I've yet encountered. Here are links to the individual talks and the full live streams for the two days:

Full Day Live Streams:

Comments

This is one of my favorite talks on deep learning, period.

The observation that saddle points (not local minima) dominate the loss surfaces of high-dimensional multilayer nets -- and that most local minima lie close to the global minimum -- is one of several convincing reasons why deep neural nets work surprisingly well after convergence. See the 25:00 mark.
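As a rough, self-contained illustration of why saddles dominate (my own toy example, not from the talk): if a critical point's Hessian behaved like a random symmetric Gaussian matrix, the chance that every eigenvalue is positive (a true local minimum rather than a saddle) collapses as the dimension grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def frac_minima(dim, trials=2000):
    """Fraction of random symmetric 'Hessians' whose eigenvalues
    are all positive, i.e. that describe a local minimum."""
    count = 0
    for _ in range(trials):
        a = rng.standard_normal((dim, dim))
        hessian = (a + a.T) / 2  # symmetrize to get real eigenvalues
        if np.all(np.linalg.eigvalsh(hessian) > 0):
            count += 1
    return count / trials

for d in (1, 2, 4, 8):
    print(d, frac_minima(d))
```

Even by dimension 8 essentially every sampled critical point is a saddle; in the millions of dimensions of a real network, minima of the random model are vanishingly rare.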

sadesoji

Some references from the video, for quick access:


kevinurban

"The point I really want to talk about is the fourth one, how do we defeat the curse of dimensionality? In other words, if you don't assume much about the world, it's actually impossible to learn about it."

thefunfamilytrees