Lecture 21: Regression Trees

I discuss regression trees, a non-parametric estimation method in which the predicted value is constant over "regions" of x_i. These regions are chosen by "recursive binary splitting", a greedy algorithm for partitioning the space of possible values of x_i.
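The procedure described above can be sketched in Python. This is a minimal illustration, not the lecture's own code: the function names, the midpoint convention for candidate cutoffs, and the minimum-leaf-size stopping rule are all my assumptions.

```python
import numpy as np

def grow(x, y, min_size=5):
    """Recursive binary splitting (a sketch): each leaf predicts the mean
    of y over its region; each split is chosen greedily to minimize the
    total within-region squared error."""
    if len(y) < 2 * min_size:
        return {"pred": y.mean()}                    # leaf: constant prediction
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = None
    for i in range(min_size, len(xs) - min_size + 1):
        s = (xs[i - 1] + xs[i]) / 2                  # midpoint cutoff candidate
        l, r = ys[:i], ys[i:]
        rss = ((l - l.mean()) ** 2).sum() + ((r - r.mean()) ** 2).sum()
        if best is None or rss < best[0]:
            best = (rss, s)
    s = best[1]
    return {"s": s,
            "left":  grow(x[x <= s], y[x <= s], min_size),
            "right": grow(x[x > s],  y[x > s],  min_size)}

def predict(tree, xi):
    """Walk down the tree until a leaf, then return its constant value."""
    while "s" in tree:
        tree = tree["left"] if xi <= tree["s"] else tree["right"]
    return tree["pred"]
```

Note the greedy aspect: each cutoff is chosen to be best for the current region only, with no lookahead, so the resulting partition is not guaranteed to be globally optimal.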
Comments

You have literally saved my life, thank you

ginaho

Excellent presentation. Thanks for uploading.

nikhiltitus

Clear explanation, thank you. One question though: for the section at 4:10, the calculation C(50, 5) = 2,118,760 counts all the combinations of selecting 5 objects out of 50; I think you are trying to calculate all the possible ways to distribute 50 objects among 5 different groups?

alen
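The arithmetic in the question above is easy to check. As an aside, one plausible alternative reading (my assumption, not something stated in the video) is counting the ways to split 50 *ordered* observations into 5 contiguous groups, which means choosing 4 of the 49 gaps between them:

```python
from math import comb

# C(50, 5): ways to choose 5 objects out of 50
print(comb(50, 5))   # 2118760

# Ways to split 50 ordered observations into 5 contiguous groups:
# choose 4 cut points among the 49 gaps between consecutive observations
print(comb(49, 4))   # 211876
```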

Great explanation, thank you. I still have a question. At 6:45, in the algorithm, you say: "Find optimal cutoff point, s". But I wonder how you do that. How do you find s? Do you iterate over each sample and keep the best? Do you cut at a sample itself, or at the average between two samples?

fifilulu
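One common convention, offered here as an assumption since the video may do it differently: take the candidate cutoffs to be the midpoints between consecutive sorted sample values, and keep the one that minimizes the residual sum of squares of the two resulting regions.

```python
import numpy as np

def optimal_cutoff(x, y):
    """Return the cutoff s minimizing total within-region squared error,
    scanning midpoints between consecutive sorted x values."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_s, best_rss = None, np.inf
    for i in range(1, len(xs)):
        s = (xs[i - 1] + xs[i]) / 2      # midpoint between two samples
        l, r = ys[:i], ys[i:]
        rss = ((l - l.mean()) ** 2).sum() + ((r - r.mean()) ** 2).sum()
        if rss < best_rss:
            best_s, best_rss = s, rss
    return best_s
```

Cutting at midpoints rather than at the samples themselves makes the rule symmetric between the two neighboring observations; either choice gives the same partition of the training data.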

Thanks for the video! Could you tell us which book you used for the "Initial Idea" section?

investigacioneseconometric

At 7:20: so in the first iteration, t1 is actually the mean of x1?

radsimu

Thank you. Is this called the M5 tree model?

firassami

Where's the regression tree? It looks like it can only do classification.

radsimu

I think your definition of a classification tree is off: Y_i is a qualitative variable, not a discrete one, and those are not the same thing.

royromerog