XGBoost Part 2 (of 4): Classification

In this video we pick up where we left off in part 1 and cover how XGBoost trees are built for Classification.
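As a rough sketch of the math the video walks through: for classification, XGBoost scores each leaf with a similarity score of (sum of residuals)² / (sum of previous probability × (1 − previous probability) + λ), and the leaf's output value uses the same denominator without squaring the numerator. The snippet below is an illustrative pure-Python version (function and variable names are mine, not from the video):

```python
# Illustrative sketch of XGBoost's similarity score and output value
# for classification. Residuals are (observed - predicted probability);
# lam is the lambda regularization parameter.

def similarity_score(residuals, prev_probs, lam=0.0):
    """(sum of residuals)^2 / (sum of p*(1-p) + lambda)"""
    numerator = sum(residuals) ** 2
    denominator = sum(p * (1 - p) for p in prev_probs) + lam
    return numerator / denominator

def output_value(residuals, prev_probs, lam=0.0):
    """sum of residuals / (sum of p*(1-p) + lambda)"""
    return sum(residuals) / (sum(p * (1 - p) for p in prev_probs) + lam)

# Example: the default initial prediction is 0.5 for every observation,
# so observed classes [1, 0, 1] give residuals [0.5, -0.5, 0.5].
res = [0.5, -0.5, 0.5]
probs = [0.5, 0.5, 0.5]
print(similarity_score(res, probs))  # 0.25 / 0.75 = 0.333...
print(output_value(res, probs))      # 0.50 / 0.75 = 0.666...
```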

NOTE: This StatQuest assumes that you are already familiar with...

Also note, this StatQuest is based on the following sources:

For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...

Buying The StatQuest Illustrated Guide to Machine Learning!!!

...or...

...a cool StatQuest t-shirt or sweatshirt:

...buying one or two of my songs (or go large and get a whole album!)

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:

Corrections:
14:24 I meant to say "larger" instead of "lower".
18:48 In the original XGBoost documents they use the epsilon symbol to refer to the learning rate, but in the actual implementation, this is controlled via the "eta" parameter. So, I guess to be consistent with the original documentation, I made the same mistake! :)
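Since the correction mentions eta: in XGBoost, the learning rate (the "eta" parameter) scales each new tree's output value when it is added to the running log(odds) prediction. A minimal pure-Python sketch of that update (function names are illustrative; 0.3 is XGBoost's default eta):

```python
import math

def sigmoid(log_odds):
    """Convert a log(odds) prediction into a probability."""
    return 1 / (1 + math.exp(-log_odds))

def update_prediction(old_log_odds, tree_output, eta=0.3):
    """Add a new tree's output value, scaled by the learning rate eta."""
    return old_log_odds + eta * tree_output

log_odds = 0.0  # initial prediction: probability 0.5
log_odds = update_prediction(log_odds, tree_output=0.6667)
print(sigmoid(log_odds))  # probability nudged above 0.5
```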

#statquest #xgboost
Comments

How do I pass any interviews without these videos? I don't know how much I owe you Josh!

TY-iltf

From Vietnam, and hats off to your talent in explaining complicated things in a way that I feel so comfortable to continue watching.


When we use fit(X_train, y_train) and predict(X_test) without watching Josh's videos or studying the underlying concepts, nothing really sinks in even if we get good results.
Thank you Josh for simplifying these hard pieces of stuff for us and creating these perfect numerical examples. Please keep up this great work.

alihaghighat

With this video I finished the whole list. I am from Colombia and it is hard to pay to learn about these concepts, so I am very grateful for your videos. And now my mom hates me when I say Double Bam for nothing!! haha

manuelagranda

Josh, on a scale of 5 you are a level 5 teacher. I have learned so much from your videos. I owe so much to Andrew Ng and you. I will contribute to Patreon once I get a job. Thank you

shaelanderchauhan

Josh! You made machine learning a beautiful subject, and I am finally in love with these Super BAM videos.

prathamsinghal

As a beginner in data science, I am super grateful for all of your tutorials. They help a lot!

wongkitlongmarcus

Thank you Josh! You literally broke everything down into the little details... Sorry I missed meeting you this time in India!

ramyasreddy

All the boosting and bagging algorithms are complicated. In universities, I have hardly seen any professor who can make these algorithms understandable the way Joshua does. Hats off, man!!

saptarshisanyal

You are a nice guy, absolutely! I can't wait for part 3. Although I have already learned XGBoost from the original paper, I still got more interesting things from your video. Thank you :D

wucaptian

I have watched all the videos from gradient boosting up to now, and you clearly explain each and every step. Thanks for sharing the information with everyone. It helps a lot of people.

yukeshdatascientist

Thank you so much for making machine learning this easy for us. Grateful for your content. Love from India

chelseagrinist

Thanks Josh. You're a life saver and have made my Data Science transition a BAM experience. Thank You!

joshisaiah

Yo, for real, these are the best data science/ML explanatory vids on the web. Great work, Josh!

seanmcalevey

I must have watched almost every video at least three times during this pandemic. Thank you so much for your effort!

changning

Thanks buddy, it was hard for me to understand how XGBoost works in classification, but this tutorial explains it well.

furqonarozikin

Bravo! Thanks for making life easy. Thanks and appreciation from Qatar.

hassaang

Thanks for boosting my confidence in understanding. There was a recent Kaggle tutorial that said a LightGBM model "usually" performs better than xgboost, but it didn't provide any context! I remember that xgboost was the gold standard-ish about 2-3 years ago (even CERN uses it, if I'm not mistaken). Anyhoo, I hope I can keep up with all of this. I need to turn my boosters on.

lambdamax

A million thanks, Josh. I cannot wait to watch your other videos about XGBoost, LightGBM, CatBoost, and deep learning. Your videos are the best.

amalsakr