Modern Optimization Methods in Python | SciPy 2017 Tutorial | Michael McKerns

There are audio issues with this video that cannot be fixed. We recommend listening to the tutorial without headphones to minimize the buzzing sound.

Highly constrained, large-dimensional, and non-linear optimizations are found at the root of most of today’s forefront problems in statistics, quantitative finance, risk, operations research, materials design, and other predictive sciences. Unfortunately, the evolution of tools for optimization has not generally kept pace with the demand for solving larger and more complex problems with increasing accuracy. However, recently, the abundance of parallel computing resources has stimulated a shift away from using reduced models to solve statistical and predictive problems, and toward more direct methods for solving high-dimensional nonlinear optimization problems.

This tutorial will introduce modern tools for solving optimization problems -- beginning with traditional methods, and extending to solving high-dimensional non-convex optimization problems with highly nonlinear constraints. We will start by introducing the cost function and its use in local and global optimization. We will then address how to monitor and diagnose your optimization convergence and results, tune your optimizer, and utilize compound termination conditions.

This tutorial will discuss building and applying box constraints, penalty functions, and symbolic constraints. We will then demonstrate methods to efficiently reduce the search space through the use of robust optimization constraints and kernel transformations. Real-world inverse problems can be expensive, so we will show how to enable your optimization to seamlessly leverage parallel computing. Large-scale optimizations can also greatly benefit from efficient solver restarts and the saving of state.

This tutorial will cover using asynchronous computing for results caching and archiving, dynamic real-time optimization, and dimensional reduction. Next we will discuss new optimization methods that leverage parallel computing to perform fast global optimizations and n-dimensional global searches. Finally, we will close with applications of global optimization in statistics and risk.
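The tutorial's notebooks are not linked here, but the opening contrast between local and global optimization of a cost function can be sketched with SciPy (an illustrative example, not code from the tutorial; the cost function is made up):

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

# A simple nonconvex cost function (illustrative only).
def cost(x):
    return np.sin(5 * x[0]) + (x[0] - 0.5) ** 2

# Local optimization: the result depends on the starting point,
# and from x0 = 2.0 the solver settles into a nearby local minimum.
local = minimize(cost, x0=[2.0], method='Nelder-Mead')

# Global optimization: searches the whole box [-2, 3] and finds
# a lower minimum than the local search did.
best = differential_evolution(cost, bounds=[(-2.0, 3.0)], seed=0)

print(local.fun, best.fun)
```

The same pattern -- wrapping a cost function and handing it to a local or global solver -- carries through the tutorial's more advanced constrained and parallel examples.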

The audience need not be an expert in optimization, but should have interest in solving hard real-world optimization problems. We will begin with a walk through some introductory optimizations, learning how to build confidence in understanding your results. By the end of the tutorial, participants will have working knowledge of how to use modern constrained optimization tools, how to enable their solvers to leverage high-performance parallel computing, and how to utilize legacy data and surrogate models in statistical and predictive risk modeling.
Comments

For anyone having difficulty with the low audio volume: you can install a browser extension like a volume booster to increase the volume. It will make the audio a little noisier, but that's better than not being able to hear at all. Hope that helps!

freewannabe

I listened at 1.5× speed; strongly recommend.

Catatom

Great talk, but the audio volume is too low. Please correct the volume and re-upload. Thanks.

RedShipsofSpainAgain

Which textbook is he talking about at 1:57? (subtitles XD)

dhananjaydadheech

Does anyone have the link to the notebook used in this tutorial? Highly appreciated if it can be publicly shared here, or through private email!

thuytrinht

Hey, I'm trying to solve an optimization problem involving one linear objective function and thousands of variables and constraints (some of which are non-linear). So, strictly speaking, it's neither linear programming nor quadratic programming -- it's nonlinear programming. I wonder if you know of any derivative-free package capable of dealing with this, other than CVXOPT or GEKKO. I've spent some time learning both, but their tutorials are lengthy and I can't seem to get a solution out of either. Thank you! I'm programming in Python, of course.
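For a problem shaped like the one described above (linear objective, nonlinear inequality constraints, no derivatives), SciPy's COBYLA method is one derivative-free option worth trying before reaching for a heavier package. A toy sketch with a made-up objective and constraints -- the real problem is of course far larger:

```python
from scipy.optimize import minimize

# Hypothetical toy problem standing in for the large one described above.
def objective(x):
    return x[0] + 2 * x[1]  # linear objective

cons = [
    # linear constraint: x0 + x1 >= 1
    {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1},
    # nonlinear constraint: x0 * x1 >= 0.1
    {'type': 'ineq', 'fun': lambda x: x[0] * x[1] - 0.1},
]

# COBYLA is derivative-free and handles inequality constraints directly.
res = minimize(objective, x0=[1.0, 1.0], method='COBYLA', constraints=cons)
print(res.x, res.fun)
```

Whether COBYLA scales to thousands of variables depends heavily on the problem structure; for very large instances a solver that exploits sparsity may still be needed.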

lizimoodyspecter

Why is there a [3] in the first example? x = opt.fmin(objective, [3])
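If it helps anyone else: the second argument to scipy.optimize.fmin is the initial guess x0, given as a list with one entry per variable, so [3] means "start the downhill simplex search at x = 3". A minimal sketch with a made-up objective:

```python
import scipy.optimize as opt

# A toy objective with its minimum at x = 2 (illustrative only).
def objective(x):
    return (x[0] - 2.0) ** 2

# [3] is the initial guess x0: the search starts from x = 3
# and converges to a point near the minimum at x = 2.
x = opt.fmin(objective, [3], disp=False)
print(x)
```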

surajadhikari

Can't see the screen at all. If you're going to make a video, make it properly. People made better videos in 1925 than this one.

at

Truly boring tone of explanation for an interesting topic.

behzadnourani

Let me see... the secret... Thanks!

raulbarrios