Linear Algebra Derivative

In this video, I do something really cool! I calculate the derivative of a function without using calculus, using linear algebra instead. Well, almost without calculus, since I still need to know the derivatives of 1, x, and x^2. This little exercise is a beautiful illustration of the interplay between linear algebra and calculus, and is probably how your calculator uses linear algebra to calculate derivatives. Enjoy!
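The idea above can be sketched in a few lines of Python (a minimal sketch with NumPy, assuming the monomial bases {1, x, x^2} and {1, x}; the matrix entries come from d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x):

```python
import numpy as np

# Derivative matrix D : P2 -> P1 in the monomial bases {1, x, x^2} and {1, x}.
# Column j holds the coordinates of d/dx(x^j).
D = np.array([[0, 1, 0],
              [0, 0, 2]])

# p(x) = 3 + 5x + 4x^2, stored as its coordinate vector in {1, x, x^2}.
p = np.array([3, 5, 4])

dp = D @ p          # coordinates of p'(x) in {1, x}
print(dp)           # [5 8]  ->  p'(x) = 5 + 8x
```

Multiplying by D differentiates any quadratic at once, with no limits taken anywhere.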

Comments
Author

And kids, that's the powerful idea we use in Control Systems for state-space representation.

o_-_o
Author

Again, I really loved your video, Dr. Peyam. Funnily enough, last semester in Linear Algebra I, I stumbled upon this fact myself while discussing diagonalizable matrices with a friend.
I would have enjoyed it if you had found a way to define differentiation entirely with tools from linear algebra, though (like why x^2 -> 2x and so on).
But this was just as entertaining, and it was really rewarding to see that I had the right idea before.

HDQuote
Author

I loved this video. You are a great mathematician. Thank you for sharing your knowledge.

Author

I love that you make the distinction between a matrix and a linear transformation. Too few people do. By the way, I thought at first it would be about the derivative of a linear transformation. Does it even make sense to differentiate a linear transformation, and does that have any neat matrix expression?

jonasdaverio
Author

The non-squareness of the matrix (including the coordinate isomorphisms, if one cares) implies that the derivative is non-invertible. Which is, of course, mostly true: we lose information about the constant, which is why the inverse operator (integration) produces a family of functions, and why one must never forget the C!
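A minimal sketch of this point, assuming NumPy: the derivative matrix from the video is not square, so it has no inverse, but its Moore-Penrose pseudoinverse acts as an integration operator that silently picks C = 0 out of the whole family of antiderivatives:

```python
import numpy as np

# Derivative matrix D : P2 -> P1 in the monomial bases {1, x, x^2} and {1, x}.
D = np.array([[0, 1, 0],
              [0, 0, 2]])

# D is not square, so it is not invertible. The pseudoinverse picks one
# antiderivative out of the family p(x) + C -- the one with C = 0.
S = np.linalg.pinv(D)

q = np.array([5, 8])          # q(x) = 5 + 8x
anti = S @ q                  # [0, 5, 4] -> 5x + 4x^2 (constant term is 0)
print(anti)

# Sanity check: differentiating the antiderivative recovers q.
print(D @ anti)
```

Any constant could be added to `anti[0]` and D would erase it again, which is exactly the lost-information point above.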

curtiswfranks
Author

Please make a video proving Stirling's approximation. I found it when solving the sum ∑_{n=1}^∞ (n·log((2n+1)/(2n-1)) − 1). I tried to evaluate it as a limit of a sum but failed; only Stirling's approximation worked.

ball
Author

Is there a reason to write φ_β(v) instead of just [v]_β? And similarly for SF(v), wouldn't you just write that as Av?


At the end, the identity would just become: A[v]_β = [T(v)]_γ.
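Assuming the intended identity is [T(v)]_γ = A[v]_β, here is a small numeric check; a non-standard basis γ = {1, 1+x} for P1 is chosen on purpose, since that is exactly the situation where the coordinate maps φ_β, φ_γ stop being "just read off the coefficients":

```python
import numpy as np

# Check [T(v)]_C = A [v]_B for T = d/dx, with B = {1, x, x^2} for P2
# and the non-standard basis C = {1, 1+x} for P1.
# d/dx(1) = 0, d/dx(x) = 1 = 1*1, d/dx(x^2) = 2x = -2*1 + 2*(1+x),
# so the columns of A are the C-coordinates of the images of B:
A = np.array([[0, 1, -2],
              [0, 0,  2]])

v_B = np.array([7, -2, 3])        # v(x) = 7 - 2x + 3x^2, coordinates in B
lhs = A @ v_B                     # A [v]_B

# By hand: v'(x) = -2 + 6x = -8*1 + 6*(1+x), so [T(v)]_C = [-8, 6].
print(lhs)                        # [-8  6]
```

With the monomial basis on both sides the coordinate maps are invisible; with γ = {1, 1+x} they genuinely do work, which is one answer to the question above.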

rigorless
Author

Maybe one can use this for the derivative of a function of multiple parameters in the direction of a custom vector?

Archik
Author

Look also at Eigenchris's series on tensors and differentials as vectors.

imrem
Author

Thank you for this video! I loved it! Algebra is really cool.

vukstojiljkovic
Author

Close. But amazing. I thought the answer was 3.1415...m (or pi m)

mcmage
Author

I don't know much about infinite-dimensional spaces, but I've heard that they exist. Could this example be extended to analytic functions, which would require matrices with infinitely many basis vectors?
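One common workaround is to truncate: keep finitely many Taylor coefficients and use a finite section of the would-be infinite derivative matrix. A hedged sketch with e^x (the choice N = 8 is arbitrary):

```python
import numpy as np
from math import factorial

# Truncate e^x to its first N Taylor coefficients and differentiate with the
# (N-1) x N derivative matrix -- a finite section of the infinite matrix one
# would need for the full space of analytic functions.
N = 8
coeffs = np.array([1.0 / factorial(k) for k in range(N)])  # e^x ~ sum x^k / k!

D = np.zeros((N - 1, N))
for k in range(1, N):
    D[k - 1, k] = k            # d/dx(x^k) = k x^(k-1)

d_coeffs = D @ coeffs
# (e^x)' = e^x, so the derivative's coefficients match the original ones,
# up to the top term lost to truncation.
print(np.allclose(d_coeffs, coeffs[:N - 1]))   # True
```

The truncation error lives entirely in the dropped top coefficient, which is the usual price of replacing an infinite matrix by a finite block.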

jeffrussert
Author

How do we know the codomain would be P1? Why not P2? Assuming no prior knowledge of the power rule.

gerbenkoopman
Author

Does it work for the chain rule, Dr.?
But I can't remember the steps. :D

shandyverdyo
Author

OK cool, can you then integrate using the inverse of this matrix? Or what is the transformation matrix for indefinite integration? Can you put in limits and do definite integration?

Behroozifyable
Author

Very elegant! What about functions like sin or cos?
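The same trick does work there: on the 2-dimensional space spanned by {cos x, sin x}, differentiation is again a linear map, and its matrix turns out to be a 90-degree rotation. A quick sketch (basis order {cos x, sin x} assumed):

```python
import numpy as np

# d/dx(cos x) = -sin x and d/dx(sin x) = cos x, so in the basis
# {cos x, sin x} the derivative is the rotation matrix below
# (columns are the coordinates of the basis images).
D = np.array([[ 0, 1],
              [-1, 0]])

f = np.array([3, 4])          # f(x) = 3 cos x + 4 sin x
df = D @ f
print(df)                     # [ 4 -3]  ->  f'(x) = 4 cos x - 3 sin x
```

Since rotations are invertible, on this space the derivative even has an inverse, unlike on the polynomial spaces.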

felipegomabrockmann
Author

Dear Dr. πm,


I'm a huge fan of your videos, I find them to be really well done and engaging! You and blackpenredpen, as well as other mathematics YouTube channels, have inspired me to start my own maths channel; I currently have 150+ subscribers, and I do relatively long-form videos that really dig into the details of the concepts. So far, I've done videos on trigonometry, and I will be branching out and using these facts to prove other lovely results.


I'm sure that as an educator, you are quite a busy man. If you happen to be interested and have some free time, I would be most honored if you checked out my videos, and gave some feedback. Thanks for all the lovely content!

WhattheHectogon
Author

Or one could use the R-algebra structure of the polynomial ring P and define a "derivation" (see [1]) as an R-linear endomorphism of P (say, d : P -> P) satisfying the Leibniz rule, which is determined by the image of X, that is, by dX. The fun part is that dX can be any polynomial, since by the Leibniz rule the relation d(X^n) = n·X^(n-1)·dX holds, which means that the derivations on P are... isomorphic to P itself!

Leibniz rule hides a very mysterious behaviour.
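The determination-by-dX fact can be sketched in plain Python, representing polynomials as coefficient lists, lowest degree first (the helper names are made up for this illustration; the rules force d(p) = p'(x)·dX for every polynomial p):

```python
# A derivation d on R[x] is R-linear and satisfies the Leibniz rule, so it is
# fully determined by dX: linearity plus Leibniz force d(p) = p'(x) * dX.
# Polynomials are coefficient lists, lowest degree first (a sketch, not a library).

def poly_mul(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_diff(a):
    return [k * a[k] for k in range(1, len(a))] or [0.0]

def poly_add(a, b):
    n = max(len(a), len(b))
    return [(a[k] if k < len(a) else 0.0) + (b[k] if k < len(b) else 0.0)
            for k in range(n)]

dX = [1.0, 0.0, 1.0]                    # pick dX = 1 + x^2; ANY polynomial works

def d(p):
    return poly_mul(poly_diff(p), dX)   # d(p) = p'(x) * dX

# Leibniz check: d(p*q) == d(p)*q + p*d(q) for p = 2 + x^3, q = -1 + 5x.
p, q = [2.0, 0.0, 0.0, 1.0], [-1.0, 5.0]
lhs = d(poly_mul(p, q))
rhs = poly_add(poly_mul(d(p), q), poly_mul(p, d(q)))
print(lhs == rhs)   # True
```

Swapping in any other `dX` keeps the Leibniz check passing, which is the P-worth of freedom the comment describes.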

FractalMannequin