Linear Algebra 15c: The Reflection Transformation and Introduction to Eigenvalues

Comments
Author

Thanks for sharing, Dr Grinfeld! Your lectures never fail to live up to the name of your channel. By not allowing them the cheap luxury of logically unwarranted shortcuts, you turn your lectures into a riveting drama in which we are faced with a challenge that requires the finest of virtues to overcome, like in all great works of art.

lm
Author

I think this is one of the best and most intuitive introductions I've seen to eigenvalues and eigenvectors! Thank you!!

albertlee
Author

Took my first Linear Algebra exam and got a 19/60... but after watching these videos I got a 53/60 on my last exam. Thank you!

marlonguerrero
Author

Finally found a teacher so good at explaining concepts intuitively, and without animations and stuff.

visheshmangla
Author

Hey Dr. Grinfeld, I have been telling my son for years to watch Strang's videos, saying he was possibly the best math teacher. But recently I discovered your channel and felt that your teaching is often just as good as Strang's, in some cases maybe even better. Then I found out that you were his PhD student at MIT, so now it all makes sense. You're making the world a better place. Thanks.

gw
Author

11:08
I just learned a new word while learning linear algebra.

mel·lif·lu·ous

adjective
(of a voice or words) sweet or musical; pleasant to hear.

kf
Author

This is incredibly helpful. A gigantic thank you :))

kategrigorieva
Author

It looks like one can rewrite Euclid's Elements using linear algebra methods, and perhaps gain insight into linear algebra and geometry.

maxpercer
Author

Excellent lectures, but I would like to see more algebraic proofs....

aleksanderaksenov
Author

For the vector orthogonal to the reflection axis, how is the eigenvalue only -1? You could multiply the reflected vector by ANY scalar, and it would still be parallel to the original vector, just stretched or shrunk. Shouldn't it be a whole eigenspace, just like for the vector parallel to the reflection axis?
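It is a whole eigenspace, but the eigenvalue is still just -1: scaling the vector changes which eigenvector you are looking at, not the eigenvalue, because the reflection sends c·v to -(c·v) for every scalar c. A minimal sketch of this, using a reflection across the x-axis as an illustrative example (the specific axis is my choice, not from the video):

```python
# Reflection across the x-axis in R^2, written as a plain 2x2 matrix
# (an illustrative choice of axis, not the one from the lecture).

def apply(M, v):
    """Apply a 2x2 matrix M to a 2-vector v."""
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

R = ((1.0, 0.0),
     (0.0, -1.0))  # reflection across the x-axis

# A vector along the axis is fixed: eigenvalue +1.
assert apply(R, (3.0, 0.0)) == (3.0, 0.0)

# A vector orthogonal to the axis is flipped: eigenvalue -1.
# Scaling it changes the eigenvector, not the eigenvalue: R sends
# c*v to -(c*v) for every scalar c, so the eigenvalue stays -1
# while the eigenspace is the whole line spanned by v.
for c in (1.0, -4.0, 0.5):
    assert apply(R, (0.0, c * 2.0)) == (0.0, -c * 2.0)
```

So the eigenspace for -1 is the entire line orthogonal to the axis, just as the eigenspace for +1 is the axis itself; each eigenvalue is fixed by the transformation.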

tangolasher
Author

So a point reflection on R2 would have the eigenspace R2? Since the point reflection of a vector v is -v, PR(v) = -v, every vector in R2 is an eigenvector and -1 is the eigenvalue.

If you alter the eigenvalue to -n, you alter the transformation to nPR. Sorry, just ranting.

Hmm.. A question: are there any other transformations, besides nPR, whose eigenvectors are the whole set?
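As I understand it, the answer is no beyond scalar maps: if every vector is an eigenvector, the transformation must be v ↦ c·v for a single scalar c (if two independent vectors had different eigenvalues, their sum would not be an eigenvector). Point reflection is the case c = -1. A small sketch of the point-reflection case; the helper name is illustrative, not from the lecture:

```python
# Point reflection on R^2 is -I: PR(v) = -v for every v, so every
# nonzero vector is an eigenvector with eigenvalue -1.

def point_reflection(v, n=1.0):
    """Apply n * PR, i.e. v -> -n * v (n = 1 is plain point reflection)."""
    return (-n * v[0], -n * v[1])

# Every vector is an eigenvector of PR with eigenvalue -1.
for v in [(1.0, 0.0), (0.0, 1.0), (3.0, -2.5)]:
    assert point_reflection(v) == (-v[0], -v[1])

# Scaling the transformation to n*PR scales the eigenvalue to -n,
# matching the comment above.
v = (1.0, 2.0)
assert point_reflection(v, n=3.0) == (-3.0, -6.0)
```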

ptyamin