DDPS | Bayesian Optimization: Exploiting Machine Learning Models, Physics, & Throughput Experiments

We report new paradigms for Bayesian Optimization (BO) that enable the exploitation of large-scale machine learning models (e.g., neural nets), physical knowledge, and high-throughput experiments. Specifically, we present a paradigm that decomposes the performance function into a reference model (e.g., obtained from physics) and a residual model and show how learning the residual from data using a Gaussian Process (GP) can help accelerate the search. We also show how we can leverage the use of a reference model to partition the design space and enable parallel search in order to exploit high-throughput experiments. In addition, we present a BO implementation that enables the use of large-scale, parametric models by designing a new acquisition function that includes basic (but scalable) uncertainty quantification capabilities. We provide motivating examples arising in controller tuning for energy systems, reactor optimization, and design of microbial communities.
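The reference-plus-residual decomposition described in the abstract can be sketched in a few lines: a cheap reference model carries the known trend, a Gaussian Process learns only the residual from data, and an acquisition function searches over reference + residual posterior. This is a minimal illustrative sketch, not the talk's implementation; the 1-D test problem, the UCB acquisition, and all function names are assumptions introduced here.

```python
# Minimal sketch of reference-model-assisted Bayesian optimization.
# Illustrative only: the 1-D problem, kernel settings, and UCB rule
# are hypothetical choices, not details from the talk.
import numpy as np

def performance(x):
    # Hypothetical expensive "true" performance function (to maximize).
    return -np.sin(3 * x) - x**2 + 0.7 * x

def reference(x):
    # Hypothetical cheap physics-based reference model: captures the
    # quadratic trend but misses the oscillatory residual.
    return -x**2 + 0.7 * x

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean/std of the residual at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 0.0, None)
    return mu, np.sqrt(var)

rng = np.random.default_rng(0)
X = rng.uniform(-1.5, 1.5, size=3)        # initial design
y = performance(X)
grid = np.linspace(-1.5, 1.5, 400)

for _ in range(10):
    # Learn only the residual f - g; the reference carries the trend.
    mu_r, sd_r = gp_posterior(X, y - reference(X), grid)
    # Upper-confidence-bound acquisition on reference + residual GP.
    ucb = reference(grid) + mu_r + 2.0 * sd_r
    x_next = grid[np.argmax(ucb)]
    X = np.append(X, x_next)
    y = np.append(y, performance(x_next))

best = X[np.argmax(y)]
```

Because the GP only has to fit the (smaller, smoother) residual rather than the full performance surface, the search typically needs far fewer expensive evaluations than a plain GP-based BO started from scratch.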

Bio: Victor M. Zavala is the Baldovin-DaPra Professor in the Department of Chemical and Biological Engineering at the University of Wisconsin-Madison and a senior computational mathematician in the Mathematics and Computer Science Division at Argonne National Laboratory. He holds a B.Sc. degree from Universidad Iberoamericana and a Ph.D. degree from Carnegie Mellon University, both in chemical engineering. He is on the editorial boards of the Journal of Process Control, Mathematical Programming Computation, and Computers & Chemical Engineering. He is a recipient of NSF and DOE Early Career awards and of the Presidential Early Career Award for Scientists and Engineers (PECASE). His research interests include statistics, control, and optimization, with applications to energy and environmental systems.

LLNL-VIDEO-846190

#LLNL #MachineLearning #DataDrivenPhysicalSimulations