Symmetric Rank 1 | Exact Line Search | Theory and Python Code | Optimization Techniques #7

In the seventh lecture, we discuss a well-known optimization technique from the family of quasi-Newton methods: the symmetric rank 1 (SR1) algorithm. This lecture covers everything you need to know about the SR1 technique, and I will show you how to use SR1 in combination with the exact line search method; a minimal illustrative Python sketch of this combination is included after the outline below. The outline of this lecture is as follows:

⏲Outline⏲
00:00 Introduction
01:06 Symmetric Rank 1 Algorithm (SR1)
04:38 Exact Line Search
05:51 Python Implementation
20:02 Animation Module
35:34 Animating Iterations
40:22 Outro
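
For readers skimming the description, here is a minimal Python sketch of the two ideas the lecture combines (an illustrative assumption of how the pieces fit together, not the code written in the video): the SR1 update applies a rank-one correction to the inverse-Hessian approximation, H_{k+1} = H_k + (s - H y)(s - H y)^T / ((s - H y)^T y) with s = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k), and for a quadratic objective the exact line search step has the closed form alpha = -(g^T d) / (d^T Q d). The function name, test problem, and tolerances below are made up for illustration.

# A minimal sketch of SR1 with exact line search on a quadratic test
# function f(x) = 0.5 * x^T Q x - b^T x (illustrative, not the lecture's code).
import numpy as np

def sr1_exact_line_search(Q, b, x0, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                    # inverse-Hessian approximation, start from the identity
    g = Q @ x - b                         # gradient of the quadratic objective
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                        # quasi-Newton search direction
        alpha = -(g @ d) / (d @ Q @ d)    # exact line search: minimizes f(x + alpha*d) for a quadratic
        s = alpha * d                     # step s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = Q @ x_new - b
        y = g_new - g                     # gradient change y_k
        v = s - H @ y
        denom = v @ y
        if abs(denom) > 1e-12:            # standard SR1 safeguard: skip the update if the denominator is tiny
            H = H + np.outer(v, v) / denom    # symmetric rank-1 correction of H
        x, g = x_new, g_new
    return x

# Usage: minimize a small 2-D quadratic; the result approaches the solution of Q x = b.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(sr1_exact_line_search(Q, b, x0=np.zeros(2)))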

📚Related Courses:

🔴 Subscribe for more videos on CUDA programming
👍 Smash that like button if you find this tutorial useful.
👁‍🗨 Speak up and comment, I am all ears.

💰 If you are able to, donate to help the channel
BTC wallet - 3KnwXkMZB4v5iMWjhf1c9B9LMTKeUQ5viP
ETH wallet - 0x44F561fE3830321833dFC93FC1B29916005bC23f
DOGE wallet - DEvDM7Pgxg6PaStTtueuzNSfpw556vXSEW
API3 wallet - 0xe447602C3073b77550C65D2372386809ff19515b
DOT wallet - 15tz1fgucf8t1hAdKpUEVy8oSR8QorAkTkDhojhACD3A4ECr
ARPA wallet - 0xf54bEe325b3653Bd5931cEc13b23D58d1dee8Dfd
QNT wallet - 0xDbfe00E5cddb72158069DFaDE8Efe2A4d737BBAC
AAVE wallet - 0xD9Db74ac7feFA7c83479E585d999E356487667c1
AGLD wallet - 0xF203e39cB3EadDfaF3d11fba6dD8597B4B3972Be
AERGO wallet - 0xd847D9a2EE4a25Ff7836eDCd77E5005cc2E76060
AST wallet - 0x296321FB0FE1A4dE9F33c5e4734a13fe437E55Cd
DASH wallet - XtzYFYDPCNfGzJ1z3kG3eudCwdP9fj3fyE

This lecture is part of a series on optimization techniques.

#optimizationtechniques #optimization #algorithm
Comments

Best lecture on quasi-Newton methods that I have found on the internet so far!

kerimetasc

Using LaTeX-generated equations like a boss. Thank you, sir Ahmad!

gamingboychannel

2:50 The animations are very nice. Thank you for taking the time to record the lecture.

yaglz

Hello Ahmad. Many thanks for your support! To be honest, I don't know much about gradient methods. I often use search-based optimization methods in my research such as GA, PSO ...

utpalgaming

Honestly, this guy is incredible. He explains everything so precisely and efficiently, without any unnecessary information. Thanks a lot for this video. You made my life easier.

walak

Ten minutes of this video explain things better than an hour of lecture in the course I'm taking 🤣 Thanks for saving my brain!

robloxeren

It's rare that a less-viewed video gives the best explanation. Your presentations are almost like 3Blue1Brown or Khan Academy! I don't know why this video has so few views!!

techguru

To find this whole course freely available on YouTube is such a gift. Seriously, you cover a LOT of ground.

benvesly

I am a PhD student and I will be using optimization methods in my research.

awesomegameplays

I've known this man only for 40 minutes, but I feel like I owe him 40 decades of gratitude. Thank you for this awesome tutorial!

ercansarusta

This guy is the most underrated YouTuber on planet Earth.

ardaerennaim

He did all this hard work and put it on the internet for free. He doesn't get much in return, but what he gets is RESPECT and credit for bringing up new aspiring engineers.

bollywoodtalkies

I can't believe these types of courses are available for free here; it's amazing how education has changed.

ayuuu

Hats off! Ahmad, I have no words to express how grateful I am for this free course. It is not only well designed but also easy to follow. God bless you.

suleymanozcan

This course has literally changed my life. Two years ago I started learning optimization from this course, and now I am a software engineering intern at a great startup. Thanks, Ahmad!

fatihbiz

Understandable thanks to the worked example, unlike explanations that go on at length using only matrix formulas. Thank you 🙏✨

furkanefebayrakc

This guy sat for about an hour, talked about Newton's method in one video, and then released it for free. Legend.

zmd

Dude, I'm less than 2 minutes in and I just want to say thank you so much for creating this absolute monster of a video.

nihathatipoglu

I'm here from yesterday's 3b1b video on Newton's method for finding roots, after wondering if there's any way to use it for minimizing a function, mainly to see why we can't use it instead of Stochastic Gradient Descent in linear regression. It turns out the Hessian of a function with many variables can be large and computationally expensive, and also that if the function is not well approximated by a parabola, the step can take you far away from the minimum. Still, it was nice to see how the method works in practice, and you mentioned the same points about Hessians too. Good job 😊👍

origamianddiy

I have been watching your videos regularly and they are very informative. Thank you for taking the time to enlighten us. Would you mind making videos on conventional optimization methods like conjugate gradient methods?

efey