Maximum Likelihood - Cramér-Rao Lower Bound Intuition

This video provides some intuition for why the variance of a maximum likelihood estimator is inversely related to the curvature of the log-likelihood, which is the idea behind the Cramér-Rao Lower Bound.
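For reference, the bound being built up to can be written compactly: for an unbiased estimator \(\hat{\theta}\),

\[
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; -\,\mathbb{E}\!\left[ \frac{\partial^{2} \ell(\theta)}{\partial \theta^{2}} \right],
\]

where \(\ell(\theta) = \log L(\theta;\,\text{data})\) is the log-likelihood and \(I(\theta)\) is the Fisher information, i.e. the expected curvature of \(\ell\) at \(\theta\).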

Comments

I think this short video is worth a few two-hour university lectures.

Borey

THIS MAKES SO MUCH SENSE!! Thank you so much for explaining this more clearly in a few minutes than my textbook could do in a few hours!

AndrewCarlson

This explanation is excellent. It makes crystal clear why there is an inverse relationship between the variance and the second derivative, why it is the second derivative, and why it is negative! Bravo, Prof. Ben!

oscarlu
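The three questions in this comment (why inverse, why the second derivative, why negative) can all be answered in one small worked example. Here is a sketch for the Gaussian mean with known variance \(\sigma^2\) (my choice of model, not necessarily the one in the video):

\[
\ell(\mu) \;=\; -\frac{1}{2\sigma^{2}} \sum_{i=1}^{n} (x_i - \mu)^{2} + \text{const},
\qquad
\frac{\partial^{2} \ell}{\partial \mu^{2}} \;=\; -\frac{n}{\sigma^{2}} \;<\; 0.
\]

The second derivative is negative because \(\ell\) is concave at its maximum; the Fisher information \(I(\mu) = -\mathbb{E}\left[\partial^{2}\ell / \partial\mu^{2}\right] = n/\sigma^{2}\) flips the sign to make it positive. The MLE \(\hat{\mu} = \bar{x}\) then has \(\operatorname{Var}(\bar{x}) = \sigma^{2}/n = 1/I(\mu)\): the sharper the peak, the larger \(I(\mu)\) and the smaller the variance, with the bound attained exactly in this case.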

In 7m and 59s you explained it better and more clearly than many 2h university lectures combined.

satltabur

Studying for actuarial exams and the material just throws Fisher Information at you with no context. This will help me understand exactly what we are expected to do in the calculations. Thank you

andrewedson

This was my intuition when studying ML estimators in statistics, but I never got a straight answer about it from my teachers. Happy to see others think of it through a geometric lens! Great video.

accountname

Like everyone else said, very well explained. I feel way less jittery about this whole concept. Thank you in 2019!

jaymei

Really appreciate videos like this where the aim is to provide an intuitive explanation of the concepts as opposed to going into detail on the maths behind them. Thanks.

LongyZ

This video makes one thing very clear to me: I find it strange how hard it evidently is for professors to provide clear intuition. Why must it be so hard to be pedagogical when you really know something, as I expect a professor does? This is a working day of headaches over horrible handouts, made understandable in 5 minutes.

johannaw

This was a fantastic intuitive explanation - thank you!

HappehLlama

Hi Mr. Lambert, I just want to take a moment to thank you for taking the time to make these videos on YouTube. They are very easy to understand, and by watching them I have been able to understand my statistical theory and Bayesian statistics courses better as an undergrad. Thanks a lot, and I wish you all the best!

yukew

Beautifully explained, my friend; intuition is almost always as important as the actual proof itself.

ishaansingh

Damn. You explained this so well. I never have any idea what my professor is talking about, but videos like this help SO MUCH. Thank you!

GuppyPal

Wow! This clarifies a good week or two from last year's lectures. I wish I had seen these videos when I was taking the course last year.

irocmath

The curvature point of view is soooo great!

Byc

This is the best video I've seen on this topic, very well done.

tomthefall

Wow, I finally get the idea behind the relationship between the covariance matrix and the Hessian.

cecicheng
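The multivariate version of the idea in that comment: near the maximum, the covariance of the MLE is approximated by the inverse of the negative Hessian of the log-likelihood (the observed information matrix). A minimal numerical sketch in Python, assuming a Gaussian model and made-up simulation settings:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=500)  # simulated data

def log_lik(theta):
    # Gaussian log-likelihood with theta = (mu, sigma)
    mu, sigma = theta
    return np.sum(-0.5 * np.log(2 * np.pi) - np.log(sigma)
                  - 0.5 * ((x - mu) / sigma) ** 2)

# Closed-form MLE for this model
mu_hat = x.mean()
sigma_hat = np.sqrt(np.mean((x - mu_hat) ** 2))
theta_hat = np.array([mu_hat, sigma_hat])

def hessian(f, t, eps=1e-4):
    # Finite-difference Hessian of a scalar function f at point t
    n = len(t)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            tpp, tpm, tmp, tmm = t.copy(), t.copy(), t.copy(), t.copy()
            tpp[i] += eps; tpp[j] += eps
            tpm[i] += eps; tpm[j] -= eps
            tmp[i] -= eps; tmp[j] += eps
            tmm[i] -= eps; tmm[j] -= eps
            H[i, j] = (f(tpp) - f(tpm) - f(tmp) + f(tmm)) / (4 * eps ** 2)
    return H

H = hessian(log_lik, theta_hat)
cov_approx = np.linalg.inv(-H)  # inverse negative Hessian ~ covariance of the MLE
print(cov_approx)               # top-left entry is close to sigma_hat**2 / len(x)

The top-left entry of cov_approx should come out near sigma_hat**2 / 500, matching the familiar Var(mu_hat) = sigma**2 / n.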

Thank you for this video. I have watched it many times over the years. The simplicity, intuition, visuals, clarity, and ease are nothing less than brilliant. It has always helped whenever things get fuzzy.

Just a small request, or a question if I may: calling the vertical axis "likelihood of the data" is a bit confusing!

Instead, should it not be the "likelihood of the parameter", that is, L(theta; data)? This "likelihood of the parameter" then happens to be numerically equal to f(data | theta). So the y-axis should not be labeled L(data | theta)?

aartisingh
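For anyone stuck on the same point: the convention the commenter describes is indeed the standard one. The likelihood is a function of the parameter, with the data held fixed,

\[
L(\theta \,;\, \text{data}) \;=\; f(\text{data} \mid \theta),
\]

so "likelihood of the parameter" is the more careful label for the vertical axis, even though the two sides are numerically equal.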

This makes so much more sense now, thank you!

mehinabbasova

High curvature -> sharp -> concentrated -> low variance. Makes sense.

Trubripes
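That chain is easy to check numerically. A quick Monte Carlo sketch in Python (sample sizes and seed are arbitrary choices): for the Gaussian mean, the curvature of the log-likelihood at its peak is n / sigma**2, and the simulated variance of the MLE tracks its reciprocal.

import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0  # true standard deviation

for n in (10, 100, 1000):
    # 5000 replicated datasets of size n; the MLE of the mean is the sample mean
    mu_hats = rng.normal(loc=0.0, scale=sigma, size=(5000, n)).mean(axis=1)
    curvature = n / sigma**2  # -(d^2/dmu^2) of the Gaussian log-likelihood
    print(f"n={n:5d}  curvature={curvature:7.1f}  "
          f"Var(mu_hat)={mu_hats.var():.5f}  1/curvature={1/curvature:.5f}")

Higher curvature (a sharper peak) lines up with lower variance, exactly as the comment says.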