Advanced Linear Algebra - Lecture 41: Low Rank Approximation and Image Compression

We introduce the Eckart-Young-Mirsky theorem, which says that the singular value decomposition (or, equivalently, the orthogonal rank-one sum decomposition) can be used to find the closest low-rank matrix to a given matrix. We then show that this theorem can be used to (lossily) compress an image.
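For readers who want to experiment, here is a minimal sketch (not the lecturer's code) of the rank-k truncation described above, assuming NumPy and a grayscale image stored as a 2-D array (a random array stands in for a real photo):

```python
# Minimal sketch of Eckart-Young-Mirsky in practice: the best rank-k approximation
# of A (in the Frobenius and spectral norms) is obtained by truncating its SVD.
import numpy as np

def rank_k_approximation(A, k):
    """Return the closest rank-k matrix to A, built from the k largest singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# A random 256x256 array standing in for a grayscale image.
A = np.random.rand(256, 256)
A20 = rank_k_approximation(A, 20)
print(np.linalg.matrix_rank(A20))        # 20
print(np.linalg.norm(A - A20, 'fro'))    # approximation error ||A - A_20||_F
```

Storing U[:, :k], s[:k], and Vt[:k, :] instead of A takes roughly k(m + n + 1) numbers rather than mn, which is where the compression comes from.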

Please leave a comment below if you have any questions, comments, or corrections.

Timestamps:
00:00 - Introduction
03:12 - Eckart-Young-Mirsky theorem
05:46 - 3x3 example
09:49 - Image compression
Comments

r/math led me here and this is great. This topic was briefly touched on during a quantum information course I attended last year, when dealing with renormalization group techniques, and indeed the core idea is the same: keep the largest populations/eigenvalues/what have you to extract the most information with a given amount of resources.

dzanc

That was the best linear algebra lecture series I have ever watched, and the best I will ever watch! Thank you!

JohannesSchmid-hmic

Went through your introductory and now advanced LA playlist, very informative, thank you!

michaelmuller

Another excellent lecture!!! The last minute of illustration on the color banding effect blew my mind!!! Thank you so so much!

supersnowva

I'm here, going to listen again. Best linear algebra teaching I have seen so far.

quantabot

I've been watching your videos since the first one of introductory L.A. all the way to this one, and thanks to you I feel really excited about how this is related to AI/ML/DL

weneedlittlepatience

Thank you! You really helped clarify my Matrix Theory course!

caseyj

Thanks for the amazing explanation!!!! Now the subject feels interesting.

vijayarana

Hello, Nathaniel, wonderful explanation of the low-rank approximation application in image compression. For R=1, it is easy to understand that U1 is the x-axis and V1 is the y-axis, which are orthogonal to each other. I am wondering what R=2 (only) and R=3 (only) look like? U2 (U3) and V2 (V3) must represent certain diagonal patterns in the 2-D image. Thank you 👍

jamesyang
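A small, hypothetical sketch (mine, not from the lecture) of one way to look at the individual rank-one terms sigma_i * u_i * v_i^T that the question above is about, using a synthetic grayscale image in NumPy/Matplotlib:

```python
# Hypothetical sketch: display the individual rank-one terms sigma_i * u_i * v_i^T
# of an image's SVD, using a synthetic 128x128 grayscale "image".
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 1, 128)
img = np.outer(x, x) + 0.3 * np.sin(10 * (x[:, None] + x[None, :]))  # gradient + diagonal stripes

U, s, Vt = np.linalg.svd(img, full_matrices=False)
for i in range(3):  # the terms kept "only" at R=1, R=2, R=3
    term = s[i] * np.outer(U[:, i], Vt[i, :])  # rank-one matrix sigma_i u_i v_i^T
    plt.subplot(1, 3, i + 1)
    plt.imshow(term, cmap='gray')
    plt.title(f'term {i + 1}')
plt.show()
```

For typical images the later u_i and v_i oscillate more, so these terms tend to look like stripe- or checkerboard-like corrections rather than simple axes.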

Hi, I know this was a while ago, but do you plan on doing any lectures on multilinear/tensors?

DrAndyShick

THANK YOU!!! I was able to finish my final project after watching this lol

FuzzyBagels