Karolina Dziugaite on Nonvacuous Generalization Bounds for Deep Neural Networks via PAC-Bayes

Abstract: Karolina presents her recent work constructing generalization bounds in order to understand existing learning algorithms and propose new ones. Generalization bounds relate empirical performance to future expected performance. The tightness of these bounds varies widely: it depends on the complexity of the learning task and the amount of data available, but also on how much information the bounds take into consideration. Her work is particularly concerned with data- and algorithm-dependent bounds that are quantitatively nonvacuous. She presents bounds built from solutions obtained by stochastic gradient descent (SGD) on MNIST. By formalizing the notion of flat minima using PAC-Bayes generalization bounds, she obtains nonvacuous generalization bounds for stochastic classifiers built by randomly perturbing SGD solutions.
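For concreteness, one standard McAllester-style form of the PAC-Bayes bound behind this line of work (stated here for context; the talk may use a tighter KL-form variant): with probability at least 1 - \delta over an i.i.d. sample of size m, every posterior Q over classifiers satisfies

e(Q) \le \hat{e}(Q) + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{m}/\delta)}{2m}},

where \hat{e}(Q) is the empirical error of the stochastic classifier Q, e(Q) its expected error, and P a prior fixed before seeing the data.

The following minimal Python sketch illustrates the recipe described in the abstract; the function names, the isotropic-Gaussian posterior, and the predict interface are illustrative assumptions, not the talk's actual code.

import numpy as np

def pac_bayes_bound(emp_err, kl_qp, m, delta=0.05):
    # McAllester-style relaxation of the bound above: upper-bounds the
    # expected error of the stochastic classifier Q.
    return emp_err + np.sqrt((kl_qp + np.log(2 * np.sqrt(m) / delta)) / (2 * m))

def gaussian_kl(w_hat, sigma_q, w_prior, sigma_p):
    # KL(Q || P) for isotropic Gaussians Q = N(w_hat, sigma_q^2 I) and
    # P = N(w_prior, sigma_p^2 I) over the network weights.
    d = w_hat.size
    return 0.5 * (d * sigma_q**2 / sigma_p**2
                  + np.sum((w_hat - w_prior)**2) / sigma_p**2
                  - d + d * np.log(sigma_p**2 / sigma_q**2))

def empirical_stochastic_error(w_hat, sigma_q, X, y, predict, n_samples=100):
    # Monte Carlo estimate of Q's empirical error: average the 0-1 loss
    # over random Gaussian perturbations of the SGD solution w_hat.
    rng = np.random.default_rng(0)
    errs = [np.mean(predict(w_hat + sigma_q * rng.standard_normal(w_hat.shape), X) != y)
            for _ in range(n_samples)]
    return float(np.mean(errs))

In Dziugaite and Roy's actual procedure the posterior variances are themselves optimized to tighten the bound; the sketch above fixes sigma_q for simplicity.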

Lecture given in June 2018 in Toronto, ON, at the Borealis AI lab.