What Lies Between a Function and Its Derivative? | Fractional Calculus

Can you take a derivative only partway? Is there any meaning to a "half-derivative"? Does such a concept even make sense? And if so, what do these fractional derivatives look like?

Previous video about Cauchy's Formula for Repeated Integration:

A really nice video that derives the gamma function from scratch:

=Chapters=
0:00 - Interpolating between polynomials
1:16 - What should half derivatives mean?
3:56 - Deriving fractional integrals
8:22 - Playing with fractional integrals
9:12 - Deriving fractional derivatives
13:53 - Fractional derivatives in action
16:19 - Nonlocality
17:54 - Interpreting fractional derivatives
18:51 - Visualizing fractional integrals
22:10 - My thoughts on fractional calculus
23:10 - Derivative zoo

===============================
MAIN SOURCES USED FOR THIS VIDEO

Podlubny, Igor. Fractional Differential Equations: An Introduction to Fractional Derivatives, Fractional Differential Equations, to Methods of Their Solution and Some of Their Applications. Academic Press, 1999

Podlubny, I.: "Geometric and physical interpretation of fractional integration and fractional differentiation." Fractional Calculus and Applied Analysis, vol. 5, no. 4, 2002, pp. 367--386.
- (for the visualization trick for fractional integrals)

- (for the zoo of alternative fractional derivatives)

===============================
Minor correction: The footnote at 7:34 should say the trig substitution produces another *whole* factor of pi (not a root pi) in the numerator which then cancels the *two* root(pi)'s that appear in the denominator from applying the half integral formula twice.

===============================
CREDITS

===============================
Thank you for your support!

===============================
The animations in this video were mostly made with a homemade Python library called "Morpho". If you want to play with it, you can find it here:

===============================
This video is part of the 3Blue1Brown Summer of Math Exposition 2 (#SoME2). You can find out more about it here:
===============================
COMMENTS

When I was a high-school kid I tried to derive fractional derivatives and integrals, but I didn't have sufficient knowledge to succeed at the time. I never thought about it later, even though I studied math at university. Until I saw this video. What an excitement you gave me by making it! Thanks a lot!

SurfinScientist

Your analogy between extending integer calculus to fractional calculus and interpreting e^(iπ) as repeated multiplication of e is perfect.

sharpnova

At 7:34, really the sqrt(pi) on the inside and the outside combine into a full pi in the denominator, which then would presumably cancel with a pi in the numerator generated by a trig substitution required to handle (t-x)^(-1/2). Trig subs love to happen when you have simple square roots of the integration variable in the denominator, and where there's trig, there's pi.

rarebeeph

I'm happy the algorithm recommended this awesome video. It's like discovering another dimension. It's that moment you realize something absolutely new and your brain celebrates it like a new birthday.

sekrasoft

Wonderful video! Is it possible to take complex-valued derivatives? What would they mean?

karan_jain

Here's a signal processing perspective: The derivative operator is a linear filter with frequency response given by the identity function. To find the half derivative, simply use a filter whose frequency response is the square root function. Of course, the tricky part is to define Fourier transforms of arbitrary functions in a meaningful way so one can apply the frequency response. I guess one can use windowed versions of the functions, then let the window width go towards infinity.
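The filter idea above can be sketched numerically, sidestepping the windowing issue by working with a periodic function on a finite grid (a numpy sketch, not part of the video): multiply the spectrum by (iω)^(1/2) and transform back. For sin(x), this Fourier-style half derivative should come out to sin(x + π/4).

```python
import numpy as np

# Half derivative via the Fourier multiplier (i*omega)^(1/2),
# computed on a periodic grid with the FFT (Weyl-type derivative).
N = 1024
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.sin(x)

omega = np.fft.fftfreq(N, d=x[1] - x[0]) * 2 * np.pi  # angular frequencies
multiplier = (1j * omega) ** 0.5                      # frequency response of D^(1/2)
half_deriv = np.fft.ifft(multiplier * np.fft.fft(f)).real

# For sin(x), the Fourier half derivative is sin(x + pi/4)
expected = np.sin(x + np.pi / 4)
print(np.max(np.abs(half_deriv - expected)))  # error near machine precision
```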

siquod

2:54 Here, I found the formula c(a) = a!/(a-½)! for the coefficient, where a is the exponent and the factorial is expressed in terms of the Gamma function: a! = Γ(a+1). It can also be extended by replacing ½ with any fraction or number you want.
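This coefficient formula has a nice built-in consistency check (a quick sketch with Python's `math.gamma`, not from the video): since taking the half derivative of x^a twice should reproduce the ordinary derivative a·x^(a−1), the coefficients must satisfy c(a)·c(a−½) = a.

```python
from math import gamma, isclose

def c(a):
    # Coefficient of the half derivative of x^a:
    # d^(1/2)/dx^(1/2) [x^a] = c(a) * x^(a - 1/2), with c(a) = Gamma(a+1)/Gamma(a+1/2)
    return gamma(a + 1) / gamma(a + 0.5)

# Applying the half derivative twice to x^a should give the ordinary
# derivative a * x^(a-1), so the coefficients must satisfy c(a)*c(a - 1/2) == a.
for a in [0.5, 1, 2, 3.7]:
    assert isclose(c(a) * c(a - 0.5), a)
```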

quintium

Wow, this is beautiful. It really demonstrates how mathematicians are able to extend concepts beyond their original domain. It also shows how doing so can eliminate structure: here we see the geometric meaning (kind of) disappear.

Mutual_Information

The derivative and integral operators can be seen as smoothing and nonlinear frequency scaling. Take the FT of your derivegral and you will essentially get a spectrum modification. For integer parameters the frequencies line up, which lets constructive and destructive interference take place properly and cancel all the nonlinearity.

That is, you have what is essentially a convolution followed by a derivative, and when you take the FT of such a thing you end up with a power-scaling relationship w^(p-a) modifying the original spectrum.

The point here is that the non-local behavior is due to the process actually working in the frequency domain; it just simplifies in the integer case. The interpolation requires certain constraints at integer values so that it lines up with our traditional usage... hence the derivegral is a generalization that simplifies to our basic operators. It's just one form of interpolation, and there can be no absolute generalization, since any generalization can work. Hence the "interpretation" of some fractional derivative is simply the specific mechanism by which the transform was designed.

kodfkdleepd

I remember asking my AP calculus teacher the same question, then my calc 2 professor as an undergrad. Three years later as a side project in my second semester of real analysis, I dug into it and even wrote a paper. Loved this topic.

magicianky

first time viewer here. this video is incredible. thank you so much! despite having done three mathematics degrees, i never learnt or used fractional calculus. this video is such a beautiful summary. i would donate if there was a "Thanks" button. my only feedback is when you make little aside notes in the corner, please just keep them on the screen a couple of seconds longer. i will be following your channel closely, i hope it gets captured by the youtube algorithm!

inverse_of_zero

The options for fractional derivatives remind me of Euclid's 5th postulate of geometry. Euclid hated that he had to explicitly state that parallel lines never intersect, but what he didn't realize is that this was required to differentiate flat-plane geometry from hyperbolic and elliptic geometries. Had mathematicians discovered a system isomorphic (is that the right word?) to his first 4 postulates but not in the context of geometry, then we would similarly find ourselves with multiple options for extending the theory.

josephrissler

It's been years since I actually did any integration (to help my son in high school calculus), but your explanation was very lucid and easy to follow. Makes me think that I've still got some pretty good math chops.

DavidRTribble

This channel is super underrated! Your animations are beautiful and the narration is incredibly clear!

dirichlettt

What a fantastic video and new math channel.

I taught fractional finite difference operators in a time series analysis class and one of the things that the text mentioned was exactly the "no interpretation" issue. They said "there's no good interpretation to this, but it models long memory processes well, so... there it is."

crimfan

I watched the whole video twice today. I just feel the need to say THANK YOU! This is at the same time beautiful, mysterious, fascinating and educational. A great use of my time. Please keep going!

alessandrocattapan

7:36 You can normalize the integral via *x := t/2 * (u + 1)* to get rid of *t* and obtain a symmetric integration domain. The remaining integral to solve is

*\int_{-1}^1 (1 + u^2) / \sqrt{1 - u^2} du = 3𝛑 / 2*

A second substitution *u := sin(v)* will yield the result.

*Rem.:* Thank you very much for this introduction to fractional analysis! It's amazing how similar the concept is to the extension of the derivative to generalized functions (aka _Schwartz' Distributions_ ).
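For anyone who wants to check the value without carrying out the trig substitution by hand, sympy evaluates the stated integral directly (a quick sketch, not from the video):

```python
import sympy as sp

u = sp.symbols('u')

# The symmetric integral produced by the substitution x = t/2 * (u + 1)
val = sp.integrate((1 + u**2) / sp.sqrt(1 - u**2), (u, -1, 1))
print(val)  # 3*pi/2
```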

carstenmeyer

I first learned about fractional derivatives watching Dr. Peyam's videos a few years ago. Since then I've been working on applying more than just square roots to differential operators. From what I've seen, fractional powers are just a really bad class of functions to do this with, because all the fractional monomials have these branches and asymptotes; it's all very messy.
Exponential function of the derivative operator? Classic, that's your shift operator. Reciprocal of a linear function of the derivative? Generalized Laplace transform! It all works surprisingly well. Using just the shift operator definition and regular derivatives, then some fairly predictable rules for how to translate things into and out of integrals, you can work up a healthy repertoire of functions applied to derivatives.

My crown jewel so far in all of this was uncovering a super secret identity! It's sort of like the mother of all generalizations of the product rule, way beyond the generalized Leibniz rule.
As symbols,
[f(D_x)] (g(x) * y(x)) =
[ [g(D_z + s)]_{z=D_x} (f(z)) ]_{s=x} y(x)
In words, for any function-of-derivative 'f' taken of a product of functions, one of those functions may be taken out and applied as a function-of-derivative _of_ 'f'. There is a quirk, given away by the use of the dummy variable 's': it ensures the operator is well-defined, because you really shouldn't mix the variable you differentiate with respect to into the operator itself, as it may become unclear with these highly non-linear function-of-derivative operators what's differentiated when. Technically there would be no notational problem, and I've over-notated the issue, especially by using square brackets to denote these function-of-derivative operators, but alas.

Within these studies, I've come closer and closer to difficult problems of little importance, as well as stumbled upon the triviality of finding an operator whose eigenvalues are the zeros of the Riemann Zeta function. Alas, I do not have the knowledge to determine or construct a space for such an operator to also be self-adjoint. It's not an easy topic to just dive into, especially without having taken a class beyond differential equations.

One of those curious problems of little importance is finding a non-trivial differential (that is, a function-of-derivative) equation whose solutions include the gamma function. I've actually gotten really close! The equation:
[e^e^-D_x]y(x) = 0
should have the gamma function as a solution. I devised this using that above identity, specialized for a particular case (I've forgotten the original derivation of how to do this) of an operator where somehow you multiply by the independent variable.
You see, this is actually impossible, because all of these 'function-of-derivative' operators (perhaps excluding peculiar cases of non-meromorphic functions, see what I said before?) are completely linear. They commute with each other. This is not conducive to having an operator where you put in a function and get out x times that function. The problem is that this operator does not commute with the derivative:
x * [D_x] f(x) = xf'(x)
[D_x] (xf(x)) = xf'(x) + f(x)
This is one of the reasons I don't like mixing the independent variable in the function-of-derivative operator, because it breaks commutativity.
But I just said I did that. How? -Magic, basically.- I cheat by using a strange trick (which, as I mentioned before, I don't remember off-hand, but iirc it is based on that identity I presented at the beginning of this comment) where, given an operator whose 0-eigenfunction (eigenfunction of eigenvalue 0) is 'f, ' I can find a new operator whose 0-eigenfunction is x*f.
By arranging the terms right, the logical next step was to try and apply this to the functional equation for the Gamma function. I forget exactly, but iirc this should yield [e^e^-D_x]. This _almost_ works! There's a very non-rigorous but highly conclusive way of getting from this operator to the gamma function using a certain method (solving these kinds of DEs is often easy because it's just a sum of exponentials whose exponential coefficients are the zeroes of the operator (A method you may be familiar with for LDEs, which can be proven to generalize as such). The trouble is that e^e^-z has no (finite) zeroes, so you have to use a really tedious method that uses integration of a manipulation of what would normally be called the characteristic function, but is identically here the function that the derivative operator is taken of, and it's a hassle.) Except, it's not this operator, it's its evil twin that's out by a sign error. I don't remember exactly how it goes, but you can see it work flawlessly for that evil twin, and diverge for the one you arrive at correctly for that method I mentioned earlier. This frustrating issue has had me scour every step of what I've explained for a sign error, to no avail.
If you don't want to work through all of this stuff that I have (inadequately) explained to see that the method really does _almost_ work, simply take the integral definition of the gamma function, and do a particular u-substitution, I think it might be u=e^-t, or maybe it was t=e^-u, and watch as you get this peculiar double exponential appearing. This is exactly what you'd expect for a function that solves this function-of-derivative equation, and it is what you get by applying this integral-of-characteristic-function method to that evil twin operator.

On the other side of things, trying to evaluate these extremely strange function-of-derivative operators is hardly possible. Actually, it's fairly straightforward to do it for any function-of-derivative where the function has a definite integral representation, by which I actually mean where the independent variable isn't in the bounds. (The variable being in the bounds would obviously be pointless, because you replace the variable with the differential operator, and I have absolutely no definition for an "integral from 3 to the derivative operator"!) Many functions have such a representation, like the Gamma Function (a coincidence from earlier, that has no application as yet to the earlier problem) to take Gamma(D_x), the Riemann Zeta function to take Zeta(D_x) (this is related to but not sufficient for what I mentioned earlier about an operator whose eigen_values_ are the zeroes of this function), and indeed 1/(s-D_x) is equivalent to the laplace transform, but with the original independent variable still there.

To give you a taste of how most of this works, I'll derive that last one, because it's a lot of fun:
Notice that the integral from 0 to infinity of a negative exponential is the negative reciprocal of its exponential coefficient, that is:
int_{0, inf} e^-zt dt = 1/z
We can use this as an integral definition for the function 1/z. If we alter this nifty function 1/z, we can get the rather versatile function 1/(s-z). Looking back this yields:
int_{0, inf} e^(z-s)t dt = 1/(s-z)
Replacing z with a differential operator D_x, we get
[ int_{0, inf} e^(D_x-s)t dt ] = [ 1/(s-D_x) ]
which is a perfectly typical construction. Notice the square brackets [ ] on the outside of the integral, which denotes that the integral is taken first, then function-of-differentiation is evaluated. We can relatively freely move those brackets inside for use as a definition for the simple reason that I've not bothered to make rigorous when you can't do that beyond just "whenever it would be okay for a perfectly linear operator." When we do this, we should separate the exponential terms out to get a better look:
int_{0, inf} e^-st * [e^tD_x] dt = [ 1/(s-D_x) ]
Now, as a function-of-derivative operator, I would leave it here, but to see why this is a generalized laplace transform, we should test it out on an arbitrary function 'f.'
[1/(s-D_x)] (f(x))
= int_{0, inf} e^-st * [e^tD_x]f(x) dt
The coolest part of all this study is how commonplace the fact is that the exponential function of a derivative is actually the shift operator, which otherwise is relegated to the characteristic functions in the niche subject of Delay Differential Equations.
= int_{0, inf} e^-st * f(x+t) dt
This is our familiar Laplace Transform. Except... isn't it supposed to be just f(t), not f(x+t)? Hehehe, indeed. Quirky, eh?
(You can show this works as a definition by applying [s-D_x] to one of these generalized Laplace transforms of your favorite functions. It's cool! And it has effective but disappointingly limited and tedious applications to solving typical constant-coefficient LDEs.)

I hope this small youtube comment made some maths enthusiasts a little more intrigued in the peculiar side of calculus.
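The shift-operator fact used throughout the comment above, [e^(tD)] f(x) = f(x+t), is easy to check for polynomials, where the exponential series in D truncates after finitely many terms (a small sympy sketch, not part of the original comment):

```python
import sympy as sp

x, t = sp.symbols('x t')
f = x**3 - 2*x + 1

# e^(t*D) expanded as the power series sum over n of t^n/n! * D^n,
# truncated at the polynomial's degree (exact for polynomials).
shifted = sum(t**n / sp.factorial(n) * sp.diff(f, x, n) for n in range(4))

# The series reproduces the shifted polynomial f(x + t) exactly
print(sp.expand(shifted - f.subs(x, x + t)))  # 0
```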

PeterBarnes

Wow, this felt like my graduate level math class wherein we define stuff that we don't really understand and point out the different strange properties it satisfies.

mohithraju

Super cool video, I'd love to learn more about it! In one of my courses, I learned about fractional PDEs for diffusion, and it blew my mind. Can't wait for the next videos!

bastienmassion