Tensors for Beginners 11: Linear maps are Vector-Covector Pairs

Error around the 2:00 mark--I wrote that basis vectors are contravariant and that basis covectors are covariant. The opposite is true. Basis vectors are covariant and basis covectors are contravariant.

I realize the audio quality in my videos is pretty bad. I've ordered a microphone which will hopefully remedy that.
Comments

"To get a more interesting linear map, we need to combine a bunch of pure linear maps together".

That's such an insightful result, hidden in plain view. It's surprising what you can figure out if you are willing to think a bit about what you're actually writing down.

Thanks for a very insightful video

himme

It's a lovely series; thanks for the effort you obviously put into this. Looking forward to more.

signorellil

I am watching all your videos and they are excellent. In the past I had to study tensors but did not understand them at all; with your videos I do. Thanks for making these videos. Now back to watching them: really exciting, and I will probably have to watch them again.

gummybears

These videos are extremely enlightening. I benefit greatly from the derivations because I can physically see how each concept builds upon and relates to the others.

xandersafrunek

Hey Chris. Your series of videos is awesome: well-timed, nicely arranged, and, most important of all for me, crystal clear and intuitively explained.

A tiny typo: I suppose the indices of the covector basis at 7:31 are supposed to be upper, but everybody knows that, so no problem, I guess.

AhmadNasikun

I enjoy these tensor videos. Clarity without being pompous, and the presentation is straightforward. Thank you!!!

joecerniawski

My goal is to understand Einstein's general relativity equations. I hope with this series of videos I can. Thanks for making things easy.

agelosbedini

EigenChris et al., the determinant of a "pure" matrix is zero.

Great tutorial here, especially how you show non-zero determinant linear maps.
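The zero-determinant claim above is easy to check numerically. A minimal sketch using NumPy; the vector and covector values here are made up for illustration:

```python
import numpy as np

# A "pure" linear map is the outer product of a vector and a covector.
v = np.array([2.0, 3.0])        # vector components (made-up values)
alpha = np.array([5.0, 7.0])    # covector components (made-up values)

L = np.outer(v, alpha)          # the pure (rank-1) matrix v alpha^T

# Its columns are all multiples of v, so the determinant vanishes.
print(np.linalg.det(L))         # ~0, up to floating-point error
```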

Mikey-mike

This 12 mins was the most condensed time in my brain life.

kimchi_taco

Interestingly, for the matrix at the 3:20 mark, a =, b = 4, c = 2 and d = 200 also break the matrix into column and row vectors, so the answer is not unique!
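The non-uniqueness noted above comes from a scaling freedom: rescaling the vector by any nonzero factor and the covector by its reciprocal yields the same pure matrix. A sketch with made-up numbers (not the matrix from the video):

```python
import numpy as np

# The column/row split of a rank-1 matrix is unique only up to scale:
# (c*v)(alpha/c)^T equals v alpha^T for any nonzero c.
v = np.array([1.0, 2.0])
alpha = np.array([3.0, 4.0])

L1 = np.outer(v, alpha)
L2 = np.outer(100 * v, alpha / 100)  # rescaled pair, same product

print(np.allclose(L1, L2))  # True
```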

Jekku

It's interesting to combine the geometric significance of eigenvalues and eigenvectors with linear transformations of the "pure" matrices here!

kiolmatsu

I reckon a lot of viewers have seen this idea before in the rank one sum decomposition (ROSD) of a matrix as it relates to a singular value decomposition (SVD) of the corresponding linear map. Every linear map has an SVD. But if I understand correctly, this video is saying that the ROSD isn't just a curiosity that all linear maps enjoy. That is, rather than thinking of ROSDs as an expression of linear maps, we can give sums of outer products pride of place, and we can think of such sums as *generating* our linear maps. So, the new perspective is that sums of outer products give us linear maps, and not the other way around (i.e., not linear maps giving us ROSDs).
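The rank-one sum decomposition mentioned above can be exhibited concretely via the SVD: every matrix is a sum of scaled outer products of singular vectors. A minimal sketch with a made-up matrix:

```python
import numpy as np

# Any matrix equals a sum of rank-1 (vector-covector) outer products;
# the SVD supplies one such decomposition.
A = np.array([[3.0, 1.0],
              [2.0, 4.0]])

U, s, Vt = np.linalg.svd(A)

# Rebuild A as sigma_1 * u_1 v_1^T + sigma_2 * u_2 v_2^T.
rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))

print(np.allclose(A, rebuilt))  # True
```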

ryanj

I think there is a mistake at 01:49 where the basis (covariant) and dual basis (contravariant) are swapped in the diagram. The index positions are, however, correct.

supermegauberful

Thanks a lot. So much basic knowledge is nicely covered in your series. I really enjoy it.

chenvinc

I really liked the phrase "boring" linear maps.

MesbahSalekeen

This nonstandard notation harkens back to the beginnings of tensor analysis laid down by Josiah Willard Gibbs and set into a textbook by Edwin Bidwell Wilson. This is dyadic notation, where the placement of two vectors adjacent to each other, without an intervening dot or cross, denoted a multilinear form. (Vector Analysis: a text-book for the use of students of mathematics and physics, founded upon the lectures of J. W. Gibbs, by E. B. Wilson, Scribner, NY, 1902, p. 265)

ronsmelser

Chris, your videos on Tensors are very helpful, plz keep up the good work...if possible kindly share with us lecture notes if you have in pdf form or any other form..thanks

rahmatkhan

Thanks as usual for a wonderful video. Here is how I like to think of it, though: in terms of full-rank and rank-deficient matrices, and the fact that an n-by-n matrix with full rank (= n) can be written as a sum of n rank-1 matrices (column-row products). The first matrix is rank-deficient and so can be written as a single column-row product, while the second matrix is full rank and needs the sum of two such products.
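The rank bookkeeping described above can be checked directly. A sketch with made-up 2x2 matrices (not necessarily those from the video):

```python
import numpy as np

# One column-row product gives a rank-1 matrix...
R1 = np.outer([1.0, 2.0], [3.0, 4.0])
print(np.linalg.matrix_rank(R1))  # 1

# ...while a full-rank 2x2 matrix needs a sum of two such products.
A = np.outer([1.0, 0.0], [5.0, 6.0]) + np.outer([0.0, 1.0], [7.0, 8.0])
print(np.linalg.matrix_rank(A))   # 2
```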

tejasnatu

At 3:50 a more elegant proof is available: the two ways of obtaining a·b·c·d must coincide (equivalent to a zero determinant without saying it).
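Spelled out, for a pure matrix built from the vector $(a, b)$ and the covector $(c, d)$, both diagonal products equal $abcd$, so the determinant cancels:

$$\det\begin{pmatrix} ac & ad \\ bc & bd \end{pmatrix} = (ac)(bd) - (ad)(bc) = abcd - abcd = 0.$$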

Krokoslav

Your videos are very helpful.
Just a minor point of clarification at 9:32. I think that the linear map here is L^T rather than L. After writing out the matrix representation of L, my understanding is that the lower index of tensor L here refers to the column number of matrix L^T, so the linear map is L^T.
From previous videos the lower index of a (1, 1) tensor such as F or B refers to the row number of the corresponding matrices.
In the context of this example, this is not important as you are just showing that you get a linear map from a vector-covector pair.

eamon_concannon
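The video's central construction, that a vector-covector pair defines a linear map, can be verified numerically: the pair (v, α) gives the map w ↦ v (α·w), whose matrix is the outer product v αᵀ. A minimal sketch with made-up component values:

```python
import numpy as np

# A vector-covector pair defines the linear map w -> v * (alpha . w).
v = np.array([1.0, 2.0])       # vector components (made-up values)
alpha = np.array([3.0, 4.0])   # covector components (made-up values)

L = np.outer(v, alpha)         # matrix of the map

w = np.array([5.0, 6.0])
# Applying the matrix agrees with scaling v by the covector's output.
print(np.allclose(L @ w, v * (alpha @ w)))  # True
```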