Normal equation solution of the least-squares problem | Lecture 27 | Matrix Algebra for Engineers

How to solve the least-squares problem using matrices.

Comments

This is the best description of using matrices to solve linear regression I have ever seen. Assuming only basic to intermediate knowledge of linear algebra, Chasnov explains each step in clear detail and then rounds things off with a simple example to explain all the math behind the powerful method of linear regression. This man is both a maths and communications genius. Greetings from Germany. Your students are incredibly lucky; you look like a professor who is determined to make sure all students understand the topics rather than one who wants students to marvel at his own math prowess.

ajsctech

Thank you, Prof. Chasnov; we appreciate your good work.

johnedakigimode

Thanks a lot, Mr. Chasnov! You really saved me hours of mental breakdown.

idanmalka

Thanks a lot, Prof. Jeff, for the valuable lecture.
@4:25: Why is A^T A an invertible matrix? As per the last lesson, an invertible matrix is one that has an inverse (A A^{-1} = I); kindly clarify what makes A^T A invertible.
@5:21: The projection matrix A(A^T A)^{-1} A^T: is there any reference that illustrates it? I think we did not come to it in the previous lectures.

noureldin
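
A note on the two questions above (a sketch of the standard argument, not taken from the lecture itself): A^T A is invertible exactly when the columns of A are linearly independent.

If A^T A x = 0, then x^T A^T A x = ||Ax||^2 = 0, so Ax = 0.
With linearly independent columns, Ax = 0 forces x = 0, so A^T A has only the trivial null space and is therefore invertible.

The matrix P = A(A^T A)^{-1} A^T at 5:21 sends any b to b_proj = Pb, the point of the column space of A closest to b. Any linear algebra text that covers orthogonal projections and least squares treats it; it is usually called the projection matrix (in statistics, the hat matrix).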

Professor, I still have one question. You say b - b_proj is orthogonal to the columns of A. Do we know at the beginning whether b is outside the column space of A? And does b - b_proj lie in the kernel of A, or does it just become shorter or longer?

lancelofjohn
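
To clarify the subspaces involved (a sketch, consistent with the lecture's setup): the residual b - b_proj satisfies A^T (b - b_proj) = 0, so it lies in the null space of A^T, which is the orthogonal complement of the column space of A; it is not generally in the null space of A. Nothing needs to be assumed about b in advance: if b happens to lie in the column space already, then b_proj = b and the residual is simply the zero vector.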

You made it look easy; thank you so much, Professor Chasnov!

yuanzhuchen

Many thanks!!! You certainly have a huge talent for teaching, as you know where the breaks in understanding sit and give them particular attention!

Sergei-ldiv

At first you start with
1) Ax = b
and then you say
2) Ax = b_proj.
You multiplied both sides of (1) by A transposed and then used (2). I can't understand how you can use both in the same proof.

EB
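
One way to reconcile (1) and (2) (a sketch of the missing step): equation (1) generally has no exact solution, but applying A^T to both sides of either equation gives the same system, because the residual is orthogonal to the columns of A:

A^T b = A^T (b_proj + (b - b_proj)) = A^T b_proj, since A^T (b - b_proj) = 0.

So A^T A x = A^T b and A^T A x = A^T b_proj are the same normal equations, and their solution x solves (2) exactly while solving (1) only in the least-squares sense.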

According to what you did, you found the slope and intercept for 3 points on a graph. How is this related to the least-squares problem? You never got around to the difference (y - y'), where y' is the value lying on the least-squares line.
I hope this question makes sense.

neepamgandhi
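
The differences y - y' are exactly the residuals the method minimizes, even though the lecture stops at the fitted coefficients. A minimal sketch in NumPy, using made-up data since the lecture's exact three points are not reproduced in this thread:

import numpy as np

# Hypothetical data points (not the lecture's actual numbers).
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0])

# Design matrix for the line y = beta0 + beta1*x.
A = np.column_stack([np.ones_like(x), x])

# Solve the normal equations A^T A beta = A^T y.
beta = np.linalg.solve(A.T @ A, A.T @ y)

y_prime = A @ beta          # y' values on the least-squares line
residuals = y - y_prime     # the differences y - y'
print(beta, residuals, (residuals**2).sum())

The sum of squared residuals printed at the end is the quantity least squares makes as small as possible; no other choice of beta0 and beta1 gives a smaller value.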

How does multiplying a vector in the null space by the transpose of that matrix get rid of the vector in the null space?

calculusguru
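
A quick numeric check of the fact behind this question (same hypothetical data as in the sketch above): the residual b - b_proj lies in the null space of A^T, so multiplying it by A^T gives the zero vector.

import numpy as np

x = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(x), x])

# Projection of b onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T
residual = b - P @ b

print(A.T @ residual)   # numerically zero (up to rounding)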

Is this similar to finding a "regression plane"?

chrischoir

Why would the columns of A typically be linearly independent in a least squares problem? (4:10)

williamwilkins
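
On why the columns are typically independent (a sketch for the straight-line fit used in the lecture): the design matrix is

A = [ 1  x_1
      1  x_2
      ...
      1  x_n ]

with columns (1, ..., 1)^T and (x_1, ..., x_n)^T. These two columns are linearly dependent only if every x_i takes the same value, so any data set with at least two distinct x-values makes the columns independent, and A^T A is then invertible.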

Hello professor, thanks for your video. But I have one question: if Ax = b is overdetermined, why can we still use "=" instead of Ax >= b or Ax <= b?

lancelofjohn
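
A sketch of why "=" still appears: overdetermined means more equations than unknowns, and typically no x satisfies Ax = b at all; equality simply fails, rather than becoming an inequality. Least squares replaces the unsolvable system with a minimization,

minimize over x:  ||Ax - b||^2,

and the minimizer does satisfy one equation exactly: the normal equations A^T A x = A^T b. That is where the "=" signs legitimately live.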

Did you reflect the video, or are you really writing backwards?

VoidFame

At 10:08, why is b equal to y? I have seen the normal equation written as A^T A x = A^T y, which confuses me because of the y in the equation.

nped
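
The two forms are the same equation under different names (a dictionary, consistent with the lecture's notation): in the line-fitting application, the right-hand side b is the vector of observed y-values and the unknown x holds the coefficients,

b = (y_1, ..., y_n)^T,  x = (beta_0, beta_1)^T,  A has rows (1, x_i),

so A^T A x = A^T b and A^T A beta = A^T y say exactly the same thing. The clash is only that x denotes the unknown coefficient vector while x_i denotes a data value.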

Why do you use the normal equations to find x and not just directly use the x = (A^T A)^{-1} A^T b equation you'd derived already?

williamwilkins
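
Both routes give the same answer in exact arithmetic, since solving A^T A x = A^T b and applying x = (A^T A)^{-1} A^T b are algebraically identical. Numerically, though, it is standard practice to solve the linear system rather than form the inverse; a sketch in NumPy (hypothetical data as above):

import numpy as np

x = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(x), x])

beta_solve = np.linalg.solve(A.T @ A, A.T @ b)   # solve the normal equations
beta_inv = np.linalg.inv(A.T @ A) @ (A.T @ b)    # explicit inverse formula

print(np.allclose(beta_solve, beta_inv))         # True: same result

Forming the explicit inverse costs more and is less accurate in floating point, which is why solvers are preferred in practice.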

At the end, with beta_0 = 1 and beta_1 = 1/2: if I plug in x = 1, then y = 1 + 1/2 * 1 = 3/2, and not 1 like in the data. Same for the other two data points. Am I missing something?

juhu
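
Nothing is missing: the least-squares line generally does not pass through any of the data points. With beta_0 = 1 and beta_1 = 1/2, the fitted value at x = 1 is indeed 3/2, leaving a residual of 1 - 3/2 = -1/2. The line is chosen so that the sum of the squared residuals over all three points is as small as possible, not so that any individual residual is zero.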

Isn't A(A^T A)^{-1} A^T just the identity matrix?

martinsanchez-hwfi
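
It is the identity only in the degenerate case where A is square and invertible, since then (A^T A)^{-1} = A^{-1} (A^T)^{-1} and the product collapses to I. For a tall m-by-n matrix with m > n, A has no inverse and that splitting is not allowed; P = A(A^T A)^{-1} A^T is instead the projection onto the column space of A. It satisfies P^2 = P and P^T = P, but its rank is n < m, so P cannot equal the m-by-m identity.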

Why is y = (1/2)x better than simply y = x? Plotted, it seems like y = x fits better.

TylerMatthewHarris

Great video... But there is a set of unnecessary confusions: you name the vector of the beta_i as x, the matrix containing the x_i as A, and the vector of y-values as b. :/
Most people are lost at that point. :/

BlackHoleGeorge
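
For readers tripped up by the naming raised in the last few comments, a closing sketch in NumPy that spells out the dictionary in code and cross-checks the normal equations against the library's least-squares routine (data again hypothetical):

import numpy as np

# Statistics names: data x_i, observations y_i, coefficients beta.
x_data = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0])

# Lecture's linear-algebra names: solve A x = b, with "x" the coefficient vector.
A = np.column_stack([np.ones_like(x_data), x_data])
b = y                                        # b is just the vector of y-values

beta = np.linalg.solve(A.T @ A, A.T @ b)            # normal equations
beta_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]   # library least squares

print(np.allclose(beta, beta_lstsq))                # True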