Adjoint Sensitivities of a Linear System of Equations - derived using the Lagrangian
We can also arrive at the equations for the adjoint sensitivities of a linear system from a different point of view. Here, we frame it as an equality-constrained optimization problem. Then we can build a Lagrangian of the problem, whose total derivative is identical to the sensitivities we are interested in. The involved Lagrange multiplier can be chosen arbitrarily, since primal feasibility already constrains our minimum. So let's choose it in a way that avoids computing a difficult quantity. And that's all the magic! :) After some more manipulation, we arrive at the same equations as in the previous video.
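The trick described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the video's own code: the concrete matrix A(theta), right-hand side b(theta), and loss J below are made-up assumptions chosen only to demonstrate the mechanics. With the constraint A(theta) x = b(theta), the Lagrange multiplier is picked so that the hard-to-compute dx/dtheta drops out, leaving one extra (transposed) linear solve.

```python
import numpy as np

def A(theta):
    # Hypothetical parameter-dependent system matrix (illustrative assumption)
    return np.array([[2.0 + theta, 1.0],
                     [1.0, 3.0]])

def b(theta):
    # Hypothetical parameter-dependent right-hand side (illustrative assumption)
    return np.array([1.0, 2.0 * theta])

def J(x):
    # Scalar loss evaluated on the solution of the linear system
    return 0.5 * x @ x

def dJ_dtheta_adjoint(theta, eps=1e-6):
    # Primal solve: x satisfies A(theta) x = b(theta)
    x = np.linalg.solve(A(theta), b(theta))
    # Choose the multiplier lambda to kill the dx/dtheta term:
    # adjoint system  A^T lambda = dJ/dx
    dJdx = x                       # gradient of J(x) = 0.5 x^T x
    lam = np.linalg.solve(A(theta).T, dJdx)
    # Partials of A and b w.r.t. theta (finite differences here for brevity;
    # in practice these are often available analytically)
    dA = (A(theta + eps) - A(theta - eps)) / (2.0 * eps)
    db = (b(theta + eps) - b(theta - eps)) / (2.0 * eps)
    # Remaining total derivative: dJ/dtheta = -lambda^T (dA/dtheta x - db/dtheta)
    return -lam @ (dA @ x - db)
```

Note that the adjoint solve is independent of the number of parameters: for many thetas, the same lambda is reused and only the cheap contractions with dA/dtheta and db/dtheta are repeated, which is the constant-in-parameters scaling mentioned in the timestamps.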
-------
Timestamps:
00:00 Introduction
00:49 Similar to using implicit differentiation
01:15 Implicit Relation
01:48 Dimensions of the quantities
02:26 Lagrangian for Equality-Constrained Optimization
03:37 Total derivative of Lagrangian
05:02 Gradient is a row vector
07:31 The difficult quantity
08:33 Clever Rearranging
09:27 Making a coefficient zero
10:31 The adjoint system
12:01 The gradient is now easier
12:37 Total derivative of Loss
14:35 Strategy for d_J/d_theta
15:47 Scales constantly in the number of parameters
16:27 The derivatives left in the equation
17:01 Outro