Maths behind XGBoost|XGBoost algorithm explained with Data Step by Step

#MathsBehindXGBoost #UnfoldDataScience

Hello,
My name is Aman and I am a data scientist.

About this video:
In this video, I explain the mathematics behind the XGBoost algorithm. I explain in detail how XGBoost works by
taking sample data and walking through the process step by step. The following topics are covered in this video:

1. Maths behind the XGBoost algorithm (a worked sketch follows this list)
2. How the XGBoost algorithm works
3. XGBoost algorithm intuition
4. Hyperparameters in the XGBoost algorithm
5. Boosting in the XGBoost algorithm
6. XGBoost algorithm explained
7. XGBoost machine learning
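
For reference, below is a minimal Python sketch of the two quantities the video builds everything on: the similarity score of a node, (sum of residuals)^2 / (number of residuals + lambda), and the gain of a split. The residual values and lambda here are made up for illustration.

```python
# Minimal sketch of the two quantities the tree construction relies on,
# using made-up residuals and lambda = 1.

def similarity_score(residuals, lam):
    # Similarity = (sum of residuals)^2 / (number of residuals + lambda)
    return sum(residuals) ** 2 / (len(residuals) + lam)

def gain(left, right, lam):
    # Gain of a split = left similarity + right similarity - root similarity
    root = left + right
    return (similarity_score(left, lam)
            + similarity_score(right, lam)
            - similarity_score(root, lam))

residuals_left = [-10.5, 6.5]   # hypothetical: rows satisfying the split rule
residuals_right = [7.5, -7.5]   # hypothetical: remaining rows
print(gain(residuals_left, residuals_right, lam=1))
```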

About Unfold Data Science: This channel helps people understand the basics of data science through simple examples, in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.

Join Facebook group:

Follow on Twitter: @unfoldds

Follow on Instagram: unfolddatascience

Watch python for data science playlist here:

Watch statistics and mathematics playlist here:

Watch End to End Implementation of a simple machine learning model in Python here:

Learn Ensemble Model, Bagging and Boosting here:

Build Career in Data Science Playlist:

Artificial Neural Network and Deep Learning Playlist:

Natural Language Processing playlist:

Understanding and building recommendation system:

Access all my code here:

Comments

This channel has become one of my favorite platforms to learn ML, owing to Aman's crisp explanations.

prateeksachdeva

At 10:27 I don't understand why the similarity score after the split is affected by a change in the lambda value before the split ("Why will the similarity score after the split go down?"). As I understood from the video, the split rule has nothing to do with the lambda value, so if the lambda value changes, the split remains the same. The only thing that changes is the gain: when the lambda value goes up, the similarity score before the split decreases, and the gain increases because the deducted value (the similarity score before the split) decreases as lambda gets higher.

ahmedidris
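
A note on the lambda question above: lambda appears in the denominator of every similarity score, the post-split leaves included, and leaves with fewer residuals shrink proportionally more, so the gain of a split typically falls as lambda rises; that is how lambda encourages pruning. A small numeric sketch with made-up residuals:

```python
# Hypothetical residuals showing that lambda shrinks every similarity
# score (small leaves most of all), so the gain of the split falls too.

def sim(res, lam):
    return sum(res) ** 2 / (len(res) + lam)

left, right = [7.0], [-3.0, 4.0, 2.0]
root = left + right
for lam in (0, 1, 10):
    g = sim(left, lam) + sim(right, lam) - sim(root, lam)
    print(f"lambda={lam:2d}  sim_left={sim(left, lam):6.2f}  gain={g:6.2f}")
```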

Listened to this video 3 times... lots of insights. Thank you.

chdoculus

Looking at the content and the no. of subscribers, this channel is highly underrated.

nikhilpawar

Sir, can you please tell us why you didn't square the SR (sum of residuals) at 12:08?
And can you tell us how the output at 14:02 is 6?
What does the output actually mean?

ranajaydas
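
On the squaring questions above: in the similarity score the sum of residuals is squared, but a leaf's output value (the number used in the prediction update) is not; it is just the sum of residuals divided by (number of residuals + lambda). A sketch with hypothetical leaf residuals:

```python
# The similarity score squares the residual sum; the leaf output value
# (used for predictions) does not. Residuals below are hypothetical.

def similarity_score(residuals, lam):
    return sum(residuals) ** 2 / (len(residuals) + lam)

def output_value(residuals, lam):
    # What the leaf contributes to a record's new prediction,
    # before the learning rate is applied -- no squaring here.
    return sum(residuals) / (len(residuals) + lam)

leaf = [6.5, 7.5]
print(similarity_score(leaf, lam=1))  # 14**2 / 3 = 65.33...
print(output_value(leaf, lam=1))      # 14 / 3    =  4.67...
```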

At 12:01, why not the square of the sum of residuals, as you stated in the formula?

HrisavBhowmick

One of the best explanations of the complex intuition behind XGBoost.

animeshbagchi

Thanks a lot, Aman. Great video. Teaching is an art, and you do justice to it every time by breaking the concept down into small steps and explaining it in a way that reaches everyone. Keep up the good work; I am expecting more videos in your NLP playlist.

abiramimuthu

Nice one here. Thank you for the simplicity employed in explaining the core concepts.

oluwafemiolasupo

Hi,

Excellent explanation, but some points are still not clear to me.

1. How do you choose the criterion to split the XGBoost tree by? For instance, you chose 'age < 10' to split the tree. How do we choose this criterion?

2. Why do we calculate the similarity score in this particular way? What is the idea behind giving a higher score when the split gathers residuals with the same sign together?

3. I feel I am still missing the 'magic' part: how do we explain this method? Why is previous prediction + LR * output a clever choice? What is the common sense behind it?

Thanks a lot!

asafjerbi
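
On question 1 above: XGBoost picks the split greedily, trying candidate thresholds (for example, midpoints between sorted feature values) and keeping the one with the highest gain; 'age < 10' would simply be the winner of that search. On question 2: residuals with the same sign add up before being squared, while mixed signs cancel, which is why same-sign groupings score higher. A toy sketch of the search, with made-up ages and residuals and lambda assumed to be 1:

```python
# Toy sketch of the greedy split search: score every candidate threshold
# on a feature and keep the highest-gain one.

def sim(res, lam=1):
    return sum(res) ** 2 / (len(res) + lam)

ages      = [5, 8, 12, 25]          # sorted feature values (hypothetical)
residuals = [-10.5, 6.5, 7.5, -7.5]

best = None
for i in range(1, len(ages)):
    threshold = (ages[i - 1] + ages[i]) / 2   # midpoint candidate
    left  = [r for a, r in zip(ages, residuals) if a < threshold]
    right = [r for a, r in zip(ages, residuals) if a >= threshold]
    g = sim(left) + sim(right) - sim(residuals)
    if best is None or g > best[0]:
        best = (g, threshold)

print(f"best split: age < {best[1]} with gain {best[0]:.2f}")
```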

Superb, simple explanation. Thank you very much.

maruthiprasad

Great video, Sir. You have explained it clearly and in a very simple way. Thanks a lot 🙏

santoshvjadhav

Sir, in the formula new prediction = old prediction + learning rate * output, I didn't understand how to get the output value of 6 for the second record. Could you explain it once again?

sachin
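
On the output-value question above: the mechanics of the update are that the leaf output (sum of residuals in the leaf divided by the count plus lambda) is scaled by the learning rate and added to the previous prediction. A minimal sketch with hypothetical numbers, using a leaf output of 6 as in the question:

```python
# Hypothetical walk-through of the boosting update:
#   new prediction = old prediction + learning_rate * leaf output value
base_prediction = 0.5   # XGBoost's default initial prediction (base_score)
learning_rate = 0.3
leaf_output = 6.0       # hypothetical output of the leaf this record falls in

new_prediction = base_prediction + learning_rate * leaf_output
print(new_prediction)   # 2.3 -- the next tree is then fit on the
                        # residuals recomputed from this updated prediction
```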

Definitely the best and most understandable explanation of XGBoost 🔥

nikhilpawar

Very nicely explained. Thanks, Sir. One of the best videos I have seen on YouTube.

ruchitagarg

Thanks a lot for this. Very helpful for me as I am brushing up on ML theory for interviewing. Awesome work!

killeraudiofile

Very nice video: brief, concise, to the point. I agree with the others; probably the best explanation on YouTube so far. Way to go, bro!

samarkhan

Thanks a lot for this excellent video! I am still curious about how XGBoost achieves parallelization and how it handles missing values, as you mentioned before. Looking forward to your new videos!

乔秦-vg
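
On the parallelization and missing-value question above: XGBoost parallelizes the split search across features and CPU cores, and it learns a default branch direction for missing values at each split. A hedged usage sketch with the xgboost scikit-learn wrapper (parameter names as in recent xgboost releases; the tiny dataset is made up):

```python
import numpy as np
from xgboost import XGBRegressor  # assumes the xgboost package is installed

# Tiny made-up dataset; the NaN marks a missing feature value, which
# XGBoost routes down a learned default branch direction.
X = np.array([[5.0], [8.0], [np.nan], [25.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])

model = XGBRegressor(
    n_estimators=100,    # number of boosted trees
    learning_rate=0.3,   # the LR in: new prediction = old + LR * output
    reg_lambda=1.0,      # the lambda in the similarity-score denominator
    max_depth=6,
    n_jobs=-1,           # parallel split search across CPU cores
)
model.fit(X, y)
print(model.predict(X))
```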

Hey, good one again! Continue your good work. Thanks.

preranatiwary

Awesome in-depth explanation; keep up the good work, man!

datadriven