Optimization | MIT Computational Thinking Spring 2021 | Lecture 16


Contents
00:00 Waiting for the class to start
03:25 Welcome, announcements
04:13 Introduction
06:15 Julia concepts
07:45 Line Fitting Many Ways
08:35 Exploratory Data Analysis
09:30 Least Squares fitting to a straight line
10:15 Direct Formulas
11:38 The Linear Algebraist's Formula
13:38 Optimization Methods
18:15 Functions of Functions and Computing Power
19:40 JuMP.jl: A popular modelling language for Optimization Problems
22:20 Gradients
23:23 Hand Computation
26:35 Finite Difference Evaluation
28:18 Automatic Differentiation (AutoDiff)
31:25 Gradient Descent can be difficult for complicated functions
40:24 Stochastic Gradient Descent
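The chapters above contrast a direct linear-algebra formula for least-squares line fitting with an optimization (gradient-descent) approach. A minimal sketch of both, written here in Python/NumPy rather than the lecture's Julia; the data, step size, and iteration count are illustrative, not from the lecture:

```python
import numpy as np

# Synthetic data: y ≈ 2 + 3x plus a little noise (illustrative values).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * x + 0.05 * rng.standard_normal(50)

# "Linear algebraist's formula": solve the normal equations AᵀAθ = Aᵀy
# for θ = (intercept, slope).
A = np.column_stack([np.ones_like(x), x])
theta_exact = np.linalg.solve(A.T @ A, A.T @ y)

# Optimization route: gradient descent on the mean squared loss
# L(θ) = ‖Aθ - y‖²/n, whose gradient is 2 Aᵀ(Aθ - y)/n.
theta = np.zeros(2)
eta = 0.1  # fixed learning rate, chosen small enough to converge here
for _ in range(5000):
    theta -= eta * 2 * A.T @ (A @ theta - y) / len(y)

# The two routes should agree on the fitted line.
```

The closed form is exact in one solve; gradient descent trades that for a method that generalizes to losses with no closed-form minimizer, which is the lecture's motivation for the later chapters.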

Comments

sgwbutcher: Adaptive eta: keep track of the change in loss and, if it increases, set eta /= 10.
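The commenter's adaptive-eta idea can be sketched as follows (a Python illustration under assumptions: the function and helper names are hypothetical, and the rule used is "if a step raises the loss, reject it and shrink eta by 10x"):

```python
def gd_adaptive_eta(loss, grad, x, eta=2.0, steps=100):
    """Gradient descent that cuts eta by 10x whenever the loss increases."""
    prev = loss(x)
    for _ in range(steps):
        cand = x - eta * grad(x)
        cur = loss(cand)
        if cur > prev:   # overshoot: the step made things worse, so shrink eta
            eta /= 10
        else:            # improvement: accept the step
            x, prev = cand, cur
    return x

# Minimize f(x) = (x - 3)^2 starting from a deliberately too-large eta;
# the first step overshoots, eta drops to 0.2, and the iteration converges.
x_min = gd_adaptive_eta(lambda x: (x - 3) ** 2, lambda x: 2 * (x - 3), 0.0)
```

This is a crude form of the line-search/backtracking idea: rather than tuning eta by hand, let the observed loss tell you when the step size is too aggressive.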