EKF-SLAM (Cyrill Stachniss)

EKF-SLAM: Landmark-based SLAM using the Extended Kalman Filter
Cyrill Stachniss, Fall 2020
Comments

Amazing! Thank you so much for the great videos!

toannn

Wow... I am watching this just before my exam in mobile robotics, and I can tell you this is the best explanation I have seen in my field.

uchennahenry

Thank you so much. Your lectures really help me a lot.

exvnisw

Thanks, this gives very detailed directions for implementing an EKF state estimator.

bpsseng

Why do we not transform the motion noise R by multiplying it from both sides with the Jacobian of the motion model w.r.t. the controls? We transformed the covariance using the Jacobian of the motion model w.r.t. the robot's pose; shouldn't the same logic apply to the motion noise?
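[Editor's note: in the velocity-motion-model formulation of EKF-SLAM (as in Thrun et al., Probabilistic Robotics), the control noise covariance M_t is indeed pushed through the Jacobian V_t of the motion model w.r.t. the controls, giving R_t = V_t M_t V_t^T, in exact analogy to how the pose covariance is pushed through the pose Jacobian G_x. A minimal sketch of the pose-block prediction, with all matrix names chosen for illustration:]

```python
import numpy as np

def predict_covariance(Sigma, G_x, V, M):
    """EKF-SLAM prediction of the 3x3 pose covariance block.

    Sigma : 3x3 pose covariance
    G_x   : 3x3 Jacobian of the motion model w.r.t. the pose
    V     : 3x2 Jacobian of the motion model w.r.t. the controls (v, w)
    M     : 2x2 control-noise covariance
    """
    # The additive state-space noise R is the control noise mapped
    # through V -- the same logic that maps Sigma through G_x.
    R = V @ M @ V.T
    return G_x @ Sigma @ G_x.T + R
```

In the full filter the resulting R is then embedded into the high-dimensional state covariance, so only the pose block is inflated by motion noise.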

MrVaginalpilz

How would I construct the F matrix for the high-dimensional space at 43:54 in the case j = 0?
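[Editor's note: a possible construction, as a sketch. Assuming the usual (3 + 2N)-dimensional state with N landmarks, the pose-projection matrix is the 3x3 identity padded with zeros; reading "j = 0" as "no landmarks in the state yet", it degenerates to the plain 3x3 identity:]

```python
import numpy as np

def F_x(N):
    """Projection matrix mapping the 3D pose into the (3 + 2N)-D state.

    For N = 0 landmarks the zero block is empty, so F_x is just I_3.
    """
    return np.hstack([np.eye(3), np.zeros((3, 2 * N))])
```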

johannstrydom

I found the part about the EKF-SLAM limit confusing. Why does the covariance of the estimate of the robot pose not decrease below its initial value? Isn't the observation of landmarks supposed to aid in determining where the robot is?

jadtawil

I don't understand the special effect of loop-closure events mathematically. Why does a loop closure reduce the Sigma?

*LOOP CLOSURE*
In EKF_CORRECTION, for a loop closure, the condition in line 9 is false: the landmark has already been seen.

delta (line 12) will be computed from the *estimated* robot pose and the *estimated* landmark position. It will not be particularly big.

Line 14: *^z_i*, the *estimated z* for point i, will be VASTLY different from *z_i* (the *measured z* for point i). This is the special thing about loop closure.
This leads to a big update of *mu* in line 18.

But why does it affect the covariance Sigma_t? Big differences between ^z_i and z_i would somehow have to affect Sigma in a reducing way, either through a big H_i or a big Kalman gain (caused by H_i as well).

But the differences between ^z_i and z_i are not reflected in H_i.

So what is special about loop-closure events for the covariance matrix Sigma, then?
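[Editor's note: the observation in this comment is correct: in the standard EKF update Sigma' = (I - K H) Sigma, the innovation z - ^z never appears, so the covariance reduction is not driven by the size of the mismatch. What loop closure contributes is structure: the current pose is correlated (through the accumulated off-diagonal terms of Sigma) with an old, well-localized landmark, and re-observing that landmark collapses the pose uncertainty. A hedged 1-D toy sketch with hypothetical numbers:]

```python
import numpy as np

# Hypothetical 1-D state: robot position x and one old landmark m.
Sigma = np.array([[4.0, 1.8],   # large pose uncertainty, correlated
                  [1.8, 1.0]])  # with a comparatively well-known landmark
H = np.array([[-1.0, 1.0]])     # relative observation of m - x
Q = np.array([[0.1]])           # measurement noise

S = H @ Sigma @ H.T + Q                  # innovation covariance
K = Sigma @ H.T @ np.linalg.inv(S)       # Kalman gain
Sigma_new = (np.eye(2) - K @ H) @ Sigma  # covariance update

# Note: the innovation z - z_hat appears nowhere above. The pose
# variance Sigma_new[0, 0] still drops sharply, purely because of the
# correlation between the pose and the well-known landmark.
```

So the "special thing" is not in the update equation itself but in the state of Sigma at the moment of re-observation.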

Infaviored

You are thinner and fitter than seven years ago :)!

yiminghu

Hi Cyrill, I noticed that in some of your videos you gloss over details, and I personally wouldn't mind them being explained more deeply. For example, at 41:30 you say "this is basically a squared function", but it is not clear to me how you got the derivative of this function.
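[Editor's note: assuming the quantity at 41:30 is the range term r = sqrt(q) with q = dx^2 + dy^2 (the usual EKF-SLAM range measurement), "basically a squared function" refers to the chain rule on the sum of squares under the root: d r / d dx = (1 / (2 sqrt(q))) * 2 dx = dx / sqrt(q). A quick numerical check of this derivative:]

```python
import math

def r(dx, dy):
    # range: square root of a sum of squares
    return math.sqrt(dx**2 + dy**2)

def dr_ddx(dx, dy):
    # chain rule on the squared terms under the root:
    # d/d(dx) sqrt(q) = (1 / (2 sqrt(q))) * dq/d(dx) = dx / sqrt(q)
    return dx / math.sqrt(dx**2 + dy**2)

# finite-difference sanity check of the analytic derivative
eps = 1e-6
num = (r(3.0 + eps, 4.0) - r(3.0 - eps, 4.0)) / (2 * eps)
```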

nicolasperez