Decision Trees and Boosting, XGBoost | Two Minute Papers #55

A decision tree is a great tool for making good decisions from a huge amount of data. In this episode, we talk about boosting, a technique for combining many weak decision trees into a strong learning algorithm.

Please note that gradient boosting is a broad concept and this is only one possible application of it!
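To make the idea more concrete, here is a minimal, hedged sketch of boosting with scikit-learn (not the code from the video; the dataset and hyperparameters are illustrative only). A single one-level decision tree (a "stump") is a weak learner, but boosting a couple hundred of them, each one focusing on the examples the previous trees got wrong, produces a much stronger classifier:

```python
# A minimal boosting sketch with scikit-learn; illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single depth-1 tree (a "stump") is a weak learner on its own...
stump = DecisionTreeClassifier(max_depth=1).fit(X_train, y_train)
print("single stump accuracy:   ", stump.score(X_test, y_test))

# ...but boosting 200 stumps (AdaBoost's default weak learner is a depth-1
# tree) combines their weighted votes into a much stronger classifier.
boosted = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("boosted ensemble accuracy:", boosted.score(X_test, y_test))
```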

__________________________________

Our Patreon page is available here:
If you don't want to spend a dime or you can't afford it, it's completely okay, I'm very happy to have you around! And please, stay with us and let's continue our journey of science together!

The paper "Experiments with a new boosting algorithm" is available here:

Another great introduction to tree boosting:

WE WOULD LIKE TO THANK OUR GENEROUS SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE:
Sunil Kim, Vinay S.

Károly Zsolnai-Fehér's links:
Comments

Thanks for taking on the XGBoost algorithm, Károly. I was quite excited to see you try to explain the XGBoost algorithm, because I (a machine learning researcher) have had difficulty understanding their methods. However, I have an issue with this video. You really only described Random Forests at a high level in this video, and didn't touch on gradient boosting, much less extreme gradient boosting (XGBoost). I fear that without such a disclaimer, our fellow scholars may be misled into thinking that XGBoost works like Random Forests, which is certainly not the case. I suggest that you rework this video so you don't mislead our fellow scholars.

Rhiever
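
For fellow scholars who want to see the distinction Rhiever points out, here is a small, hedged sketch (not the paper's or the video's code; the dataset and hyperparameters are made up). A Random Forest grows many deep trees independently on bootstrap samples and averages them, while gradient boosting grows shallow trees one after another, each new tree fitting the errors left by the ensemble so far:

```python
# Random Forest (bagging) vs. gradient boosting, side by side; illustrative only.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: deep trees trained independently on bootstrap samples,
# predictions averaged at the end.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

# Boosting: shallow trees trained sequentially, each one fitting the
# residual errors left by the trees built before it.
gb = GradientBoostingRegressor(
    n_estimators=300, max_depth=3, learning_rate=0.1, random_state=0
).fit(X_train, y_train)

print("random forest R^2:    ", rf.score(X_test, y_test))
print("gradient boosting R^2:", gb.score(X_test, y_test))
```

XGBoost follows the second, sequential recipe, with additional regularization and engineering on top.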

I've been watching your channel for about a year, but it's crazy that you've been making videos for such a long time! Respect.

thvist

Where is part 2? The video does not cover boosting at all. This video should be labeled "Decision Trees".

christianbach

Thanks for explaining this stuff in 2 minutes. I lose interest easily, but the quick pace and the visuals really help. Thanks!

aonoymousandy

You did not explain XGBoost at all. How is this a "Two Minute Papers" episode?

pinardemetci

I had been searching for an intuitive explanation, thanks a ton!

fitha

What is the difference between bagging, boosting, voting, and stacking? And which of these methods do Random Forest and XGBoost relate to?

datascientist
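
A hedged sketch answering the question above, using scikit-learn's stock ensembles (the models and hyperparameters are illustrative, not from the video). Bagging trains copies of one learner on bootstrap samples and averages them, boosting trains weak learners sequentially on the previous learners' mistakes, voting combines different models by majority vote, and stacking feeds different models' predictions into a second-level meta-model. Random Forest is a bagging method; XGBoost is a boosting method:

```python
# Bagging vs. boosting vs. voting vs. stacking; illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              StackingClassifier, VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

ensembles = {
    # Bagging: one base learner, many bootstrap samples, averaged predictions
    # (a Random Forest is bagged decision trees plus per-split feature sampling).
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),
    # Boosting: weak learners trained in sequence, each focusing on the
    # previous learners' mistakes (XGBoost belongs to this family).
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Voting: different model types trained independently, majority vote.
    "voting": VotingClassifier(estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("nb", GaussianNB()),
        ("lr", LogisticRegression(max_iter=1000)),
    ]),
    # Stacking: the base models' predictions become features for a
    # second-level "meta" model.
    "stacking": StackingClassifier(
        estimators=[
            ("tree", DecisionTreeClassifier(random_state=0)),
            ("nb", GaussianNB()),
        ],
        final_estimator=LogisticRegression(max_iter=1000),
    ),
}

for name, model in ensembles.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```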

What's the difference between XGBoost and Random Forest(s)?

arkrou

Wonderful tl;dr explanation. Thank you

cooltoonist

Is knowing the math behind the algorithms a must, or is knowing how the algorithms work enough? Please, please, please give a reply.

rafsunahmad

It seems that a lot of concepts are combined here: decision trees, neural networks, gradient descent. I'm a bit confused.

tolstoievski

Good video quality.
Is XGBoost just an implementation of boosting?

kswill
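
To the question above: not quite. Boosting is the general idea; XGBoost is one specific, heavily optimized implementation of gradient-boosted decision trees, with extra regularization and systems-level tricks. A minimal usage sketch, assuming the xgboost Python package is installed (parameters are illustrative):

```python
# Minimal XGBoost sketch; assumes `pip install xgboost scikit-learn`.
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBClassifier(
    n_estimators=300,   # number of boosted trees
    max_depth=3,        # shallow trees, as is typical for boosting
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    reg_lambda=1.0,     # L2 regularization on leaf weights, one of XGBoost's extras
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```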

Couldn't even finish the video since I find it quite sexist...

josefinaalconada