Modified Newton method | Exact Line Search | Theory and Python Code | Optimization Algorithms #4


In this one, I will show you what the Modified Newton algorithm is and how to use it with the exact line search method. We will approach both methods from intuitive and animated perspectives. The difference between Damped Newton and the Modified Newton method is that the Hessian may become singular (or indefinite) at some iterations, so Modified Newton applies diagonal loading, also known as Tikhonov regularization, at each iteration. As a reminder, Damped Newton, just like Newton's method, builds a local quadratic approximation of the function from information at the current point, and then jumps to the minimum of that approximation. Imagine fitting a small quadratic surface to your function at the current point in higher dimensions, and then moving toward the minimum of that approximation to find the next point: the search direction points toward the minimizer of the quadratic model. In fact, this animation shows why, in certain cases, Newton's method can converge to a saddle point or a maximum: if the Hessian has non-positive eigenvalues, the local quadratic approximation is an upside-down paraboloid.
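The regularized Newton step with exact line search described above can be sketched in Python as follows. This is a minimal illustration under my own assumptions, not the implementation from the video; all function names, the test function, and parameters (such as the regularization floor `mu`) are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def modified_newton(f, grad, hess, x0, tol=1e-8, max_iter=100, mu=1e-4):
    """Modified Newton: Tikhonov-regularized Newton steps + exact line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Diagonal loading: grow mu_k until H + mu_k * I admits a Cholesky
        # factorization, i.e. is positive definite.
        mu_k = 0.0
        while True:
            try:
                L = np.linalg.cholesky(H + mu_k * np.eye(len(x)))
                break
            except np.linalg.LinAlgError:
                mu_k = max(2.0 * mu_k, mu)
        # Solve (H + mu_k * I) d = -g using the Cholesky factor.
        d = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        # Exact line search: minimize phi(alpha) = f(x + alpha * d) over alpha.
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method="bounded").x
        x = x + alpha * d
    return x

# Example: a nonconvex function whose Hessian is indefinite near the origin.
f = lambda x: x[0]**4 + x[1]**4 - 4.0 * x[0] * x[1]
grad = lambda x: np.array([4.0 * x[0]**3 - 4.0 * x[1],
                           4.0 * x[1]**3 - 4.0 * x[0]])
hess = lambda x: np.array([[12.0 * x[0]**2, -4.0],
                           [-4.0, 12.0 * x[1]**2]])
x_star = modified_newton(f, grad, hess, np.array([0.5, 1.5]))
```

Because the loaded matrix is positive definite, `d` is always a descent direction, and the exact line search guarantees the objective decreases at every iteration, which plain Newton does not.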

⏲Outline⏲
00:00 Introduction
00:55 Modified Newton Method
03:41 Exact line search
04:55 Python Implementation
18:46 Animation Module
34:14 Animating Iterations
37:02 Outro


🔴 Subscribe for more videos on CUDA programming
👍 Smash that like button if you find this tutorial useful.
👁‍🗨 Speak up and comment, I am all ears.

💰 If you are able to, donate to help the channel

#python #optimization #algorithm
Comments

Why are you dropping three videos (each about 40 minutes) a week? Please take care of your wellbeing, Ahmad :) We love you <3

yasindemirel

I don't understand why some people are hating. Yes, Prof Ahmad missed a couple of symbols (once in a lifetime), but he's still the best!

ak_amazingkinggaming

This will complete the Linear Algebra / Calculus / Optimization trio I was missing to truly understand Machine Learning.

rasxodmusic

That's correct, the value shown in the video is a local minimum, and the function is unbounded for large negative x and y values.

razanalkado

I like how you make examples to explain the concepts of each method! Brilliant and easy to understand! Thank you so much!!!

FuadEkberov

This is the best explanation of Optimization I have ever come across. This guy is talented

SadManEdition

I am studying at Erasmus University Rotterdam for Non-Linear Optimization. This video helped me significantly. Thank you for your efforts.

kral-brawlstars

This and the video on the second Wolfe condition helped me completely understand the concepts and the intuition behind these criteria. Thank you very much! Amazing explanations.

nursyazwani

Amazing lectures! I'm using these to study for my Continuous Optimization course

JosephBenson

It's a good example of why it's important to understand the shape of the function you are optimizing, and one of the potential downfalls of gradient based solvers.

dannycashy

Brilliant! One of the best explanations of gradient descent I have ever seen. Well done, man.

Informationalcreator

I really like the way you explained the optimizing algorithm. love it :)

ankushsharma

Peace be upon you, Doctor. Could you cover the simplest or best algorithms in Meta-heuristic Optimization? Wishing you success, God willing.

anuragtech

Thank you so much, sir, for such simple and brilliant graphical illustrations.

oromoshow

Excellent explanations thank you! Will watch more

thevowtv

My professor at Stanford recommended this channel.

erenhd

This nailed down the Adam paper. Thanks a lot.

dj.emyemy

This lecture was recommended by my professor at Harvard

yahiabenliram

This is incredible... crystal clear explanations.

arifdrgn

Cannot thank you enough professor Ahmad Bazzi.

tamerolur