The Hessian matrix | Multivariable calculus | Khan Academy

The Hessian matrix is a way of organizing all the second partial derivative information of a multivariable function.
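That "organizing" can be sketched numerically: a central-difference approximation fills an n×n matrix of second partial derivatives. This is a minimal illustration (the function and evaluation point are made-up examples, not from the video):

```python
def hessian(f, p, h=1e-5):
    """Approximate the Hessian of f at point p with central differences."""
    n = len(p)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            def fh(di, dj):
                q = list(p)
                q[i] += di
                q[j] += dj
                return f(q)
            # d^2 f / (dx_i dx_j) via a 4-point central difference
            H[i][j] = (fh(h, h) - fh(h, -h) - fh(-h, h) + fh(-h, -h)) / (4 * h * h)
    return H

# Example: f(x, y) = x^2 * y has Hessian [[2y, 2x], [2x, 0]]
H = hessian(lambda p: p[0] ** 2 * p[1], [1.0, 2.0])
```

At (1, 2) this returns approximately [[4, 2], [2, 0]], matching the analytic second partials.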
Comments

If anyone is wondering why the mixed derivatives are the same: it's Schwarz's theorem.

francescomura
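The symmetry this comment refers to can be checked symbolically. A small sketch with SymPy (the example function here is made up; Schwarz's theorem guarantees this for any twice continuously differentiable function):

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x**3 * y**2 + sp.sin(x * y)  # a made-up C^2 function

f_xy = sp.diff(f, x, y)  # differentiate in x, then in y
f_yx = sp.diff(f, y, x)  # differentiate in y, then in x

# Schwarz's theorem: the two orders of differentiation agree
assert sp.simplify(f_xy - f_yx) == 0
```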

Grant <3. Always helping me with math

balajilakshminarayan

Hey Grant, love your videos from Khan and 3Blue1Brown!

therealbean

Awesome video! Thank you! And wow! It's 3Blue1Brown's voice doing this video!

AJ-etvf

Thank you so much, I get much more information here in one day than in a month at university :)

haiarpyzargarian

You have the same voice as the guy on 3blue1brown

shcraft.

You are perfect, thanks for your videos and your funny mood :)

catouncormery

Yo Khan Academy, thank you for making these videos. They are real lifesavers at times ^w^

jasonsoto

I love you, Khan!! You saved me today <333

MohamedJama-zttk

So the Hessian matrix is valid only for scalar-valued functions, right? If my intuition is correct, then for a vector-valued function with, say, 4 components, would there be 4 Hessian matrices?

anamitrasingha
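The intuition in the question above matches the usual convention: the Hessian is defined for scalar-valued functions, and a vector-valued function gets one Hessian per output component. A hedged sketch with SymPy (the component functions are made up):

```python
import sympy as sp

x, y = sp.symbols("x y")
# A made-up vector-valued function f: R^2 -> R^2, given by its components
components = [x**2 * y, sp.exp(x) + y**3]

# One Hessian per scalar component (a 4-component function would give 4)
hessians = [sp.hessian(c, (x, y)) for c in components]
```

Stacked together, these form a third-order tensor rather than a single matrix.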

Thank you sir, this video has given me a good idea

subashsubashsubash

Awesome! :-)
I have a question:
what kind of tools are you using when you work?

I really wanna get that blackboard tool :-) Thanks in advance!

im-alida

What does the Hessian matrix represent geometrically? In particular, what does the determinant of the Hessian matrix measure?

debralegorreta
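One standard answer to the question above (a well-known fact, not something stated in the video): at a critical point of a two-variable function, the sign of the Hessian determinant f_xx·f_yy − f_xy² drives the second-derivative test, because the determinant is the product of the Hessian's eigenvalues and so tells you whether the curvatures agree in sign. A minimal sketch:

```python
def classify_critical_point(fxx, fyy, fxy):
    """Second-derivative test from the 2x2 Hessian entries at a critical point."""
    det = fxx * fyy - fxy ** 2  # determinant of the Hessian
    if det > 0:
        # Curvature has the same sign in every direction
        return "local minimum" if fxx > 0 else "local maximum"
    if det < 0:
        # Curvature changes sign: concave up one way, down another
        return "saddle point"
    return "test inconclusive"

# f(x, y) = x^2 + y^2 at (0, 0): fxx = 2, fyy = 2, fxy = 0
print(classify_critical_point(2, 2, 0))   # local minimum
# f(x, y) = x^2 - y^2 at (0, 0): fxx = 2, fyy = -2, fxy = 0
print(classify_critical_point(2, -2, 0))  # saddle point
```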

What if the output of the function f were a vector with 3 rows instead of a single expression? How would the Hessian change?

hussainbhavnagarwala

The moment I clicked on this link: oh, this is the 3blue1brown guy!

siyuzhang

Can someone explain why the ideal learning rate for gradient descent in 2 or more dimensions is the inverse of the Hessian (the matrix of second partial derivatives)?

BayesianBrain
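A sketch of the usual intuition behind the question above: for a quadratic f(x) = ½ xᵀAx − bᵀx the Hessian is A, and stepping by the inverse Hessian (a full Newton step x − H⁻¹∇f) lands on the minimizer in one move. That is why H⁻¹ plays the role of the ideal per-direction step size near a minimum. The matrix and start point below are made up:

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # Hessian of the quadratic (made-up, SPD)
b = np.array([1.0, -1.0])

def grad(x):
    return A @ x - b  # gradient of 0.5 x^T A x - b^T x

x0 = np.array([5.0, -4.0])              # arbitrary starting point
x1 = x0 - np.linalg.solve(A, grad(x0))  # Newton step: x - H^{-1} grad
x_star = np.linalg.solve(A, b)          # exact minimizer A^{-1} b
```

Algebraically, x1 = x0 − A⁻¹(Ax0 − b) = A⁻¹b regardless of x0, so the single step reaches x_star.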

Sal, if I have 1 equation and 6 independent variables, my first partial derivatives form a vector with 6 terms. If I follow, the Hessian will be a 6x6 matrix. Is that correct? Thanks!!! I contribute to you, as your program and platform make an amazing contribution!

rubyemes
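The reading in the question above is right: one scalar equation in 6 variables gives a gradient with 6 entries and a 6×6 Hessian. A quick sanity check with SymPy (the function is made up):

```python
import sympy as sp

xs = sp.symbols("x0:6")                    # 6 independent variables x0..x5
f = sum(v**2 for v in xs) + xs[0] * xs[5]  # a made-up scalar function

grad = [sp.diff(f, v) for v in xs]  # 6 first partial derivatives
H = sp.hessian(f, xs)               # 6x6 matrix of second partials
```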

How do I know which Khan Academy channel this video is from?

vsprwlz

Do you guys know which lecture/series/playlist is this video from? Please let me know! Thanks!

wunanzeng

Good day, I was wondering whether you know of any Python library that implements second-order gradient descent with the Hessian error matrix. If you can point me in the right direction, it would be very helpful. Thanks in advance, kind regards
Shantanu

shantanu_bhattacharya
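Not an answer from the video itself, but one well-known option for the question above: `scipy.optimize.minimize` with `method="Newton-CG"` (or `"trust-ncg"`) accepts a Hessian callback via the `hess` argument. A hedged sketch on a made-up quadratic objective:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up objective with its minimum at (1, -2)
def f(x):
    return (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2

def grad(x):
    return np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

def hess(x):
    return np.array([[2.0, 0.0], [0.0, 20.0]])  # constant Hessian for this quadratic

res = minimize(f, x0=np.zeros(2), method="Newton-CG", jac=grad, hess=hess)
```

`res.x` converges to approximately (1, -2); the Hessian callback lets the solver take Newton-type steps instead of plain gradient steps.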