Least Squares Approximations

Description: We can't always solve Ax=b exactly, but we can use orthogonal projections to find the vector x̂ such that Ax̂ is as close to b as possible.
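For anyone who wants to try this numerically, here is a minimal NumPy sketch (the matrix and vector are invented for illustration): the least-squares solution x̂ solves the normal equations AᵀAx̂ = Aᵀb, and Ax̂ is then the orthogonal projection of b onto the column space of A.

import numpy as np

# An inconsistent system: 3 equations, 2 unknowns, no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Least-squares solution via the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the orthogonal projection of b onto col(A):
# the residual b - A @ x_hat is orthogonal to every column of A.
residual = b - A @ x_hat
print(x_hat)           # [ 5. -3.] for this data
print(A.T @ residual)  # ~ [0. 0.]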

This video is part of a Linear Algebra course taught at the University of Cincinnati.

Comments

Dude, you are honestly a godsend; this helped so much, thank you.

elizabethautumn

Thank you for explaining the underlying mechanics of the process! This was super helpful!

NguyenTran-snvt

Hey Dr. Bazett, thank you so much for this series. I'm a second-year mathematics student at Imperial College, and this has really helped to summarise a lot of the material that I have learnt this year! I really appreciate how clear and well-explained this course was, so again, thank you!

tarunmistry

This helps so much with my understanding; nice explanation, thank you!

vp

You are an amazing teacher. Thank you for saving my time and getting straight to the point 💖💖

codercodes

Is orthogonality needed in the expansion in (1) below the LSA? If the set {a_i} is not orthogonal, then I don't think the right-hand side is the orthogonal projection of v onto W.

NguyenHoang-jwyu
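Regarding the question above: yes, orthogonality is exactly what that expansion needs. A sketch of the distinction in standard notation (the formulas below are the usual ones, not quoted from the video):

\[
\operatorname{proj}_W(\mathbf{v}) = \sum_i \frac{\mathbf{v}\cdot\mathbf{a}_i}{\mathbf{a}_i\cdot\mathbf{a}_i}\,\mathbf{a}_i
\qquad \text{(valid only when } \{\mathbf{a}_i\} \text{ is orthogonal)}
\]

If the a_i are not orthogonal, the individual terms overcount the directions the a_i share, so one instead solves the normal equations:

\[
A^{\mathsf{T}}A\,\hat{\mathbf{x}} = A^{\mathsf{T}}\mathbf{v}, \qquad \operatorname{proj}_W(\mathbf{v}) = A\hat{\mathbf{x}}.
\]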

At 6:50: do the vectors a_i need to be orthogonal? Why or why not?

alaindevos
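The same question came up above, so here is a quick numeric check, a minimal NumPy sketch with made-up data: with non-orthogonal columns the term-by-term projection formula disagrees with the true least-squares projection, and after orthogonalizing the columns (via QR) it agrees.

import numpy as np

# Hypothetical data: two non-orthogonal columns spanning a plane W in R^3.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
v = np.array([1.0, 2.0, 3.0])

# Term-by-term formula sum_i (v.a_i / a_i.a_i) a_i -- valid only for orthogonal a_i.
naive = sum((v @ a) / (a @ a) * a for a in A.T)

# True orthogonal projection of v onto col(A), via least squares.
x_hat, *_ = np.linalg.lstsq(A, v, rcond=None)
true_proj = A @ x_hat

print(naive)      # [2.5 1.5 0. ]: wrong, because the columns are not orthogonal
print(true_proj)  # [1. 2. 0.]: the actual projection

# After orthogonalizing the columns (QR), the term-by-term formula works.
Q, _ = np.linalg.qr(A)
print(sum((v @ q) / (q @ q) * q for q in Q.T))  # matches true_proj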

Here's the follow-up for people who ended up here without following the playlist:

bodhi_db