Nonlinear Least Squares

Finding the curve of best fit using the nonlinear least squares method.

Covers a general function, with the derivation carried out through a Taylor series.
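The approach the description mentions, linearizing a general model about the current parameter estimate via a first-order Taylor series and solving the resulting linear least-squares problem, can be sketched in a few lines of numpy. The exponential model below is a hypothetical example for illustration, not the one worked in the video:

```python
import numpy as np

def gauss_newton(model, jac, t, y, c0, tol=1e-12, max_iter=50):
    """Iteratively linearize the model about the current parameter
    estimate (first-order Taylor series) and solve the resulting
    linear least-squares problem for the parameter update."""
    c = np.asarray(c0, dtype=float)
    for _ in range(max_iter):
        r = y - model(t, c)                 # residuals at current estimate
        J = jac(t, c)                       # Jacobian of model wrt parameters
        dc, *_ = np.linalg.lstsq(J, r, rcond=None)  # solve J @ dc ≈ r
        c = c + dc
        if np.linalg.norm(dc) < tol:
            break
    return c

# Hypothetical model y = c1 * exp(c2 * t), chosen only for illustration
model = lambda t, c: c[0] * np.exp(c[1] * t)
jac = lambda t, c: np.column_stack([np.exp(c[1] * t),              # dy/dc1
                                    c[0] * t * np.exp(c[1] * t)])  # dy/dc2

t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * t)                   # noiseless synthetic data
c_fit = gauss_newton(model, jac, t, y, c0=[1.0, 1.0])
print(c_fit)
```

With noiseless data and a reasonable starting point, the iteration recovers the true parameters to machine precision in a handful of steps.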
Comments

You explained in 10 minutes what my professor couldn't in two weeks.

joseguerrero

Thank you for your detailed explanation of this method. After learning it, I tried to write code for the example you introduced here. As you rightly mentioned, the initial guess is very important, particularly for c2. By looking at the data, we can get an idea of c2, which is related to the period of the cosine curve. If c2 is guessed too low or too high, the algorithm won't converge; it must be close to the true value. Thank you again for your great video. Educational, detailed, and very helpful.
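The sensitivity to the initial guess for c2 that this comment describes is easy to reproduce. The sketch below assumes a model y = c1·cos(c2·t) with made-up true values (not the exact data from the video) and applies the same linearize-and-solve iteration; a guess near the true frequency converges, while a far-off guess settles elsewhere:

```python
import numpy as np

def fit_cosine(t, y, c1_0, c2_0, iters=100):
    """Gauss-Newton fit of y = c1*cos(c2*t), returning the final (c1, c2)."""
    c1, c2 = float(c1_0), float(c2_0)
    for _ in range(iters):
        r = y - c1 * np.cos(c2 * t)                      # current residuals
        J = np.column_stack([np.cos(c2 * t),             # dy/dc1
                             -c1 * t * np.sin(c2 * t)])  # dy/dc2
        try:
            dc, *_ = np.linalg.lstsq(J, r, rcond=None)
        except np.linalg.LinAlgError:                    # iteration blew up
            break
        c1, c2 = c1 + dc[0], c2 + dc[1]
        if np.hypot(dc[0], dc[1]) < 1e-12:               # converged
            break
    return c1, c2

t = np.linspace(0.0, 10.0, 200)
y = 2.0 * np.cos(3.0 * t)           # synthetic data: true c1 = 2, c2 = 3

good = fit_cosine(t, y, 1.0, 2.9)   # c2 guessed close to the true frequency
bad = fit_cosine(t, y, 1.0, 0.5)    # c2 guessed far too low
print("near guess:", good)
print("far guess: ", bad)
```

Because the residual as a function of c2 oscillates, the far-off guess lands in a local minimum rather than the true frequency, which matches the behavior the comment reports.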

SalehGoodarzian

Sir, you are amazing! I wonder how you would explain the Gauss-Newton and Levenberg-Marquardt algorithms as well. Thank you!

victory

What a good explanation! Really got the point, Prof.

morrismbuba

Thank you, man, that was an excellent explanation.

gththcoc

Thank you so much! Can you please explain more about Gauss-Newton and Levenberg-Marquardt algorithms?

nmana

I'm not sure if this is a typo but you write "t" instead of "x" consistently throughout the video.

matthewjames

You have y(x), yet x never appears in the equation on the right-hand side?

GuideEver

Hey there, great video. Just wondering: how many parameters should we use? Does it matter how many parameters we put in?

haniffshazwanss

That was very helpful; thank you very much.

elsbbbb

Can you do this for a function with two variables, e.g. f(x, y; a, b) and g(x, y; c, d)?

alb

This was really helpful sir. Thank you :-)

rishabhsangal

How do you make the initial guess for the Ci prime vector (3:40)? Do these values affect the solution's convergence? Thank you.

konstantinmetodiev

Nice Aggie Ring! Thanks & Gig em AERO 23'

johnyoung

Isn't this just gradient descent?

samuelkushnir