Numerical Optimization Algorithms: Gradient Descent

In this video we discuss a general framework for numerical optimization algorithms. We will see that this involves choosing a direction and a step size at each step of the algorithm. In this video, we investigate how to choose a direction using the gradient descent method. Future videos discuss how to choose the step size.

Topics and timestamps:
0:00 – Introduction
2:30 – General framework for numerical optimization algorithms
18:41 – Gradient descent method
32:05 – Practical issues with gradient descent
36:53 – Summary
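The framework from the video (pick a descent direction, take a step, repeat until the gradient is near zero) can be sketched in Python. The function names, the fixed step size, and the example objective below are my own illustrative assumptions, not code from the lecture:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a function by repeatedly stepping along the negative gradient.

    grad     : callable returning the gradient at a point
    x0       : starting point
    step     : fixed step size (a simplifying assumption; the video notes
               that choosing it well is its own problem)
    tol      : stop when the gradient norm falls below this threshold
    max_iter : safety cap on the number of iterations
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # near-zero gradient: (local) minimum
            break
        x = x - step * g              # move in the steepest-descent direction
    return x

# Example objective: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3)
grad_f = lambda v: np.array([2.0 * (v[0] - 1.0), 4.0 * (v[1] + 3.0)])
print(gradient_descent(grad_f, [0.0, 0.0]))  # approaches [1, -3]
```

With a fixed step of 0.1 each coordinate error shrinks by a constant factor per iteration, so the iterate converges to the minimizer well within the iteration cap.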

#Optimization

Comments

AE501: Good introduction to the algorithms and the descent method; it makes the calculation process very clear and straightforward.

ryanmeinhardt

AE 501 - Johnny Riggi: Good to see that we are still using the gradient to break down complex problems. It proves that all of math builds on itself!

GiovanniRiggi-jq

AE501: As always, you were able to explain a difficult concept such as the gradient descent method in a very clear way. Thank you!

ahmedashmaig

AE501: Your explanation of the numerical optimization process made it sound so simple! The gradient descent method makes a lot of intuitive sense for finding a local minimum, especially with the visualization of descending a mountain in the direction of steepest descent. I can also see why selecting an appropriate step size is important for finding the optimal solution. Thanks, Professor!

aimeepak

AE501: Good introduction of the gradient descent method.

Chuan-YuTsai

AE 501: Super helpful examples and very useful concept

joshshort

AE501: The graphical representation with numbers was a great way of understanding gradient descent! Wonderful lecture!

elijahleonen

AE512: Very powerful use of an extremely simple concept - very cool

chayweaver.

AE 501: Wow, very interesting how a simple concept such as the gradient can be used for optimization. Definitely opened my eyes to the various and complex optimization techniques that exist. Thank you!

Gholdoian

AE501: While I had thought about inefficiency in searching for the minimum, I hadn't even considered the possibility of jumping over the local minimum completely. A good thing to keep in mind for sure. -Maggie Shelton
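The overshoot mentioned here is easy to demonstrate. On the toy objective f(x) = x² (gradient 2x), the fixed-step update x ← x − α·2x contracts toward the minimum only when the step size α is small enough; this snippet is my own illustration, not material from the lecture:

```python
def descend_1d(x, step, iters=10):
    """Run gradient descent on f(x) = x**2 (gradient 2*x) with a fixed step."""
    for _ in range(iters):
        x = x - step * 2.0 * x   # each iteration multiplies x by (1 - 2*step)
    return x

small = descend_1d(1.0, step=0.1)   # factor 0.8 per step: converges toward 0
large = descend_1d(1.0, step=1.1)   # factor -1.2 per step: jumps over the
                                    # minimum and moves farther away each time
print(small, large)
```

After ten iterations the small step has shrunk x toward zero, while the large step has overshot the minimum on every update and grown in magnitude instead.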

maggiesplantgirl

AE501 - This playlist is really thorough and helpful; will be referring back to this even after the course has ended. Thank you!

philipcasey

AE501: You always do a great job of explaining these topics. Optimization is so cool.

lukewideman

AE501: Thank you for the introduction to the gradient descent method.

KennethWright-kh

AE 501: Great video; it broke the concept down and made it easy to understand.

edwardmau

AE 501: The hiking topography-style map was a very helpful example for understanding this concept!

JulieWarchol

AE501: Thank you for this great lecture; it really helps in understanding the overall ideas!

JustinYoung-jf

AE 501: This is quite neat and seems useful for many situations.

inderbhangal

Thank you very much for that big picture at 14:00; I would be lost without it.
AE501

nghihoang

Really great lecture, Dr. Lum. This topic has had some personal relevance for me lately, so, as always, thank you!! Hope you and yours are well.

wiloberlies

AE501: Your pseudo code for numerical optimization algorithms makes a great "cheatsheet". Thank you!

ethanngo