L20.10 Maximum Likelihood Estimation Examples

MIT RES.6-012 Introduction to Probability, Spring 2018
Instructor: John Tsitsiklis

License: Creative Commons BY-NC-SA
Comments

Thank you for this amazing video!! But I have a quick question: why do you MINIMIZE the negative of that function instead of directly MAXIMIZING the function itself?

ireneisme
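A quick numerical sketch of why the two are equivalent: negating a function flips its graph upside down, so the point that minimizes -f is exactly the point that maximizes f. Optimization libraries conventionally minimize, which is why the negative log-likelihood shows up so often. (The data and grid below are invented for illustration, not taken from the video.)

```python
import numpy as np

# Made-up data: 500 samples from a normal with mean 3, variance 1.
rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=1.0, size=500)

# Candidate values for the mean parameter theta.
thetas = np.linspace(0.0, 6.0, 601)

# Gaussian log-likelihood in theta (variance fixed at 1),
# dropping additive constants that do not depend on theta.
log_lik = np.array([-0.5 * np.sum((data - t) ** 2) for t in thetas])

best_max = thetas[np.argmax(log_lik)]   # maximize f
best_min = thetas[np.argmin(-log_lik)]  # minimize -f: same answer
print(best_max, best_min)
```

Both approaches land on the same grid point, which sits right next to the sample mean, matching the analytic MLE for a Gaussian mean.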

Blasphemous negligence of the chain rule when taking that first derivative to find the roots for the maximum likelihood estimate of the mean in the second example. Good thing the squared difference is symmetric!

nicolasbourbaki
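For reference, the chain-rule step the comment is pointing at, in the video's Gaussian setup with observations x_1, ..., x_n:

```latex
\frac{d}{d\mu}(x_i-\mu)^2 = 2(x_i-\mu)\cdot(-1) = -2(x_i-\mu)
```

The dropped factor of -1 only flips the sign of an expression that is being set equal to zero, so the root \hat\mu = \bar{x} is unaffected; that is why the symmetry of the squared difference saves the day.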

For the binomial case, why is it that we don't take a product for the likelihood function?

(Is it because there's only one observation?)

anangelsdiaries
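Right: with a single Binomial(n, p) observation the likelihood is just that one PMF factor. Equivalently, the product over the n underlying Bernoulli outcomes collapses to the same function of p up to a constant, so both views give the same MLE. A small sketch (the numbers n = 10, k = 7 are made up):

```python
import numpy as np
from math import comb

n, k = 10, 7  # hypothetical: 10 flips, 7 heads observed

# Grid of candidate values for p.
p = np.linspace(0.001, 0.999, 999)

# Likelihood from the single Binomial(n, p) observation: one PMF factor.
binom_lik = comb(n, k) * p**k * (1 - p)**(n - k)

# Likelihood from the n individual Bernoulli outcomes: a product of n
# factors, p for each head and (1 - p) for each tail. Same shape in p,
# just missing the constant comb(n, k).
bern_lik = p**k * (1 - p)**(n - k)

p_hat_binom = p[np.argmax(binom_lik)]
p_hat_bern = p[np.argmax(bern_lik)]
print(p_hat_binom, p_hat_bern)  # both ~ 0.7 = k/n
```

The constant binomial coefficient does not depend on p, so it cannot move the maximizer; that is why the single PMF works as the likelihood.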

Thank you very much, sir, for this lesson. You made everything so simple 🙏🙏

enzokuhlemsotra

After watching the video, it's still not clear to me why the optimal value turns out to maximize the likelihood function.

kevinxu
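One way to see it, assuming the Gaussian example with the variance v held fixed: the log-likelihood is concave in \mu, so its unique stationary point is the global maximum:

```latex
\frac{\partial^2}{\partial\mu^2}\sum_{i=1}^{n}\left(-\frac{(x_i-\mu)^2}{2v}\right)
  = -\frac{n}{v} < 0
```

Since the second derivative is negative everywhere, setting the first derivative to zero really does locate a maximizer, not a minimizer or a saddle point.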

Can someone expand the formula to show how he got rid of the exponential when taking the log of exp{-(x - u)^2 / 2v}?

oneaboveall
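A sketch of the algebra, assuming the usual normal density with mean \mu and variance v: the log and the exponential are inverse functions, \ln(e^u) = u, so taking the log leaves just the exponent behind:

```latex
\ln\!\left[\frac{1}{\sqrt{2\pi v}}\,
  \exp\!\left(-\frac{(x-\mu)^2}{2v}\right)\right]
  = -\frac{1}{2}\ln(2\pi v) - \frac{(x-\mu)^2}{2v}
```

The first term comes from the log of the normalizing constant; the second is the exponent, with the exponential gone.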

Why is the variance under the root? Shouldn't it be outside of root(2pi)?

porterchien

9:09 How come you can cancel out the 2 and a v? They are both denominators.

jovialjoe_
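A sketch of the step around 9:09, assuming the first-order condition for v is the standard one: multiplying both sides by 2v^2 clears both denominators at once, which is why a 2 and one power of v cancel even though both sit in denominators:

```latex
-\frac{n}{2v} + \frac{\sum_i (x_i-\hat\mu)^2}{2v^2} = 0
\;\Longrightarrow\;
-nv + \sum_i (x_i-\hat\mu)^2 = 0
\;\Longrightarrow\;
\hat v = \frac{1}{n}\sum_i (x_i-\hat\mu)^2
```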

Why can we directly use the PMF of the binomial to compute the ML estimate instead of setting up the likelihood function as in the second example?

m.preacher

Hello, thanks for the effort. I think there's a mistake when you minimized w.r.t. v: the denominator of the sum term should be 4v^2.

husseinsleiman
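For what it's worth, differentiating the v-dependent sum term (assuming it carries the standard 1/(2v) factor) seems to give 2v^2 in the denominator rather than 4v^2, since the constant 2 is not itself squared by the chain rule:

```latex
\frac{d}{dv}\left[\frac{\sum_i (x_i-\mu)^2}{2v}\right]
  = \sum_i (x_i-\mu)^2 \cdot \left(-\frac{1}{2v^2}\right)
  = -\frac{\sum_i (x_i-\mu)^2}{2v^2}
```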

Yo, this guy is awesome. He sounds like Junior from Kim Possible.

drakoumell

If you dislike this video, I'm sorry, but you are just weak at advanced maths/statistics. It's not the MIT professor's fault; this isn't for everybody.

yaweli