Scott Clark - Using Bayesian Optimization to Tune Machine Learning Models - MLconf SF 2016


Using Bayesian Optimization to Tune Machine Learning Models: In this talk we briefly introduce Bayesian Global Optimization as an efficient way to optimize machine learning model parameters, especially when evaluating different parameters is time-consuming or expensive. We will motivate the problem and give example applications.

We will also talk about our development of a robust benchmark suite for our algorithms including test selection, metric design, infrastructure architecture, visualization, and comparison to other standard and open source methods. We will discuss how this evaluation framework empowers our research engineers to confidently and quickly make changes to our core optimization engine.

We will end with an in-depth example of using these methods to tune the features and hyperparameters of a real-world problem, and give several real-world applications.
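To make the idea concrete, here is a minimal, self-contained sketch of the Bayesian optimization loop the talk describes: fit a Gaussian-process surrogate to the evaluations so far, then pick the next point by maximizing expected improvement. The kernel, toy objective, and all parameter choices below are illustrative assumptions for a 1-D problem, not the talk's or SigOpt's actual engine.

```python
import math
import numpy as np

# Illustrative Bayesian optimization sketch (assumed setup, not SigOpt's engine).
# Surrogate: Gaussian process with an RBF kernel; acquisition: expected improvement.

def rbf_kernel(a, b, length_scale=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """GP posterior mean and stddev at x_test, conditioned on observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_train
    var = np.diag(rbf_kernel(x_test, x_test) - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    """EI for minimization: expected amount by which each point beats best_y."""
    z = (best_y - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2)))
    phi = np.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)
    return (best_y - mu) * Phi + sigma * phi

def expensive_objective(x):
    """Stand-in for a slow evaluation, e.g. a full model-training run."""
    return np.sin(3 * x) + 0.5 * x**2

x_grid = np.linspace(0.0, 2.0, 200)      # candidate hyperparameter values
x_train = np.array([0.1, 1.0, 1.9])      # a few initial evaluations
y_train = expensive_objective(x_train)
initial_best = y_train.min()

for _ in range(10):                      # each iteration = one expensive evaluation
    mu, sigma = gp_posterior(x_train, y_train, x_grid)
    x_next = x_grid[np.argmax(expected_improvement(mu, sigma, y_train.min()))]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, expensive_objective(x_next))

best_x = x_train[np.argmin(y_train)]
best_y = y_train.min()
```

The point of the loop is sample efficiency: because each `expensive_objective` call stands in for a costly training run, the surrogate decides where to evaluate next instead of a grid or random search.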
Comments

Wow finally someone who understands this problem

samirelzein
What happens when the function that we sample is stochastic instead of deterministic? Would Gaussian processes (or another surrogate model), and thus Bayesian optimization, have a way to deal with that? EDIT: I'm referring to what Scott states at around 13:10.

dominicmutzhas