771: Gradient Boosting: XGBoost, LightGBM and CatBoost — with Kirill Eremenko

#GradientBoosting #XGBoost #LightGBM #CatBoost

Kirill Eremenko joins @JonKrohnLearns for another exclusive, in-depth teaser for a new course just released on the SuperDataScience platform, “Machine Learning Level 2”. Kirill walks listeners through why decision trees and random forests are fruitful for businesses, and he offers hands-on walkthroughs of today's three leading gradient-boosting algorithms: XGBoost, LightGBM, and CatBoost.

In this episode you will learn:
• [00:00:00] Introduction
• [00:07:58] All about decision trees
• [00:20:33] All about ensemble models
• [00:37:17] All about AdaBoost
• [00:45:21] All about gradient boosting
• [00:59:56] Gradient boosting for classification problems
• [01:04:09] Advantages of XGBoost
• [01:18:00] LightGBM
• [01:33:27] CatBoost
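
The core loop covered around [00:45:21] can be sketched in a few lines. This is a toy from-scratch illustration, not the API of XGBoost, LightGBM, or CatBoost: start from a constant prediction, then repeatedly fit a small tree (here a one-split "stump") to the residuals — the negative gradient of squared error — and add it to the ensemble with a learning rate.

```python
# Toy gradient boosting for regression with decision stumps.
# Teaching sketch only; real libraries add regularization, deeper
# trees, and histogram/ordering tricks on top of this same loop.

def fit_stump(xs, residuals):
    """Find the single threshold split on x that minimizes squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    return best[1], best[2], best[3]

def boost(xs, ys, n_rounds=50, lr=0.1):
    base = sum(ys) / len(ys)              # round 0: predict the mean
    preds = [base] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]  # negative gradient of MSE
        t, lmean, rmean = fit_stump(xs, residuals)
        stumps.append((t, lmean, rmean))
        preds = [p + lr * (lmean if x <= t else rmean)
                 for x, p in zip(xs, preds)]
    return base, lr, stumps

def predict(model, x):
    base, lr, stumps = model
    return base + sum(lr * (lmean if x <= t else rmean)
                      for t, lmean, rmean in stumps)
```

With a step-shaped target like `ys = [1, 1, 1, 5, 5, 5]`, fifty rounds at `lr=0.1` drive the predictions close to the true plateau values, each round shrinking the residual by a constant factor.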

Comments

Great episodes guys! I started my journey in data with Kirill's Udemy courses in Power BI and Python! He is truly a great educator.

mabenba

Currently taking the course, ML level 2!

Moses_

Loved it, thank you both! ❤️

Manuel


Hi, at 1:26:52 for LightGBM you mentioned the value behind sparse columns. Can we then convert sparse columns to categorical columns?

milleniumsalman
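
The question above asks about folding sparse columns back into one categorical column. Assuming the sparse columns are mutually exclusive one-hot indicators, a minimal pure-Python sketch (the column names are illustrative, not from the episode):

```python
# Collapse one-hot indicator columns back into a single categorical
# value per row; rows with no indicator set fall back to "other".

def onehot_to_categorical(rows, onehot_cols, fallback="other"):
    out = []
    for row in rows:
        active = [c for c in onehot_cols if row.get(c, 0) == 1]
        out.append(active[0] if active else fallback)
    return out

rows = [
    {"country_FR": 1, "country_DE": 0},
    {"country_FR": 0, "country_DE": 1},
    {"country_FR": 0, "country_DE": 0},   # no indicator set
]
print(onehot_to_categorical(rows, ["country_FR", "country_DE"]))
# ['country_FR', 'country_DE', 'other']
```

LightGBM can then handle the resulting column natively via its `categorical_feature` parameter instead of splitting on many sparse indicators.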

Great video, thank you! I have one question about CatBoost: when it is a classification problem, how do you replace categorical column values like France, Germany, etc.? Using your example, do you look at the first 50 rows for Germany and then choose the most common (majority) class in the target column to replace the word Germany?

fouried
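
The question above touches on the *ordered target statistics* idea CatBoost is known for. As I understand it, a category like "Germany" is replaced not by the majority class but by a smoothed running mean of the target over the rows seen *before* the current one (which avoids leaking a row's own label). A minimal sketch; the prior and weight are illustrative defaults, not CatBoost's actual parameters:

```python
# Ordered target statistics for one categorical feature: each row's
# category is encoded as the smoothed mean of the target over the
# preceding rows that share the category.

def ordered_target_stats(categories, targets, prior=0.5, weight=1.0):
    counts, sums = {}, {}
    encoded = []
    for cat, y in zip(categories, targets):
        n = counts.get(cat, 0)
        s = sums.get(cat, 0.0)
        # smoothed mean of the target over earlier rows with this category
        encoded.append((s + weight * prior) / (n + weight))
        counts[cat] = n + 1
        sums[cat] = s + y
    return encoded

print(ordered_target_stats(["FR", "FR", "DE", "FR", "DE"], [1, 0, 1, 1, 0]))
# [0.5, 0.75, 0.5, 0.5, 0.75]
```

Note that the same category can receive different encoded values at different row positions, which is why CatBoost also averages over several random row permutations.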