XGBoost Regression In-Depth Intuition Explained - Machine Learning Algorithms

XGBoost is a decision-tree-based ensemble machine learning algorithm that uses a gradient boosting framework. In prediction problems involving unstructured data (images, text, etc.), artificial neural networks tend to outperform all other algorithms or frameworks; for small-to-medium structured/tabular data, however, decision-tree-based algorithms like XGBoost are considered best-in-class.
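For readers asking for a practical implementation (see the comments below), here is a minimal sketch of XGBoost regression using the Python xgboost package; the data and parameter values are hypothetical, chosen only to illustrate the API:

```python
# Minimal sketch of XGBoost regression (toy data, illustrative parameters).
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # toy tabular features
y = 3 * X[:, 0] + rng.normal(scale=0.1, size=200)  # toy target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(
    n_estimators=100,   # number of boosted trees
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    reg_lambda=1.0,     # L2 term (the lambda in the similarity score)
    gamma=0.0,          # minimum gain a split must achieve (pruning)
)
model.fit(X_train, y_train)
print(model.predict(X_test[:5]))
```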

#xgboostregression
#xgboost
Comments

Sir, I'm a huge fan of yours. Although I already knew XGBoost for regression, after watching this I can say how simple it is. You clearly explained each and every concept, like the similarity weight, how to make a split, and gamma for pruning. Unlike other YouTubers, who have made this algorithm complex, now I can recommend this video to my colleagues.

vishaldas

I felt XGBoost was too complicated, so I chose this video by Krish Naik sir, because he makes things very simple. And now let's go. Thank you very much, sir!!!

pravinshende.DataScientist

For the similarity weight you have written the formula as sigma(x squared), while what you actually compute is (sigma x) squared.

avinashajmera
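For reference, several comments here point at the same issue: for regression with squared-error loss, the XGBoost similarity score is the square of the sum of the residuals, not the sum of their squares:

```latex
\text{Similarity} = \frac{\left(\sum_{i=1}^{n} r_i\right)^2}{n + \lambda}
```

(In the general form from the XGBoost paper this is $(\sum_i g_i)^2 / (\sum_i h_i + \lambda)$; with squared-error loss each hessian $h_i$ is 1, which gives the $n + \lambda$ denominator.)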

121/2 is 60.5. I know it's not a big mistake, but sometimes I take notes from your videos, and while revising after a month, if the values are wrong I need to redo the calculations, and it also creates doubt.

shantanusingh

At 12:36 you calculated the gain as 143.48, but it should be 243.58.

anuragshrivastava
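For anyone re-deriving these numbers, the gain of a split is computed from the similarity scores of the two children and their parent, and gamma then acts as the pruning threshold:

```latex
\text{Gain} = \text{Similarity}_{\text{left}} + \text{Similarity}_{\text{right}} - \text{Similarity}_{\text{root}},
\qquad \text{prune the split if } \text{Gain} - \gamma < 0.
```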

Hey, I am watching you from Uzbekistan. Thank you, you explain things very simply. Thank you so much!

music_vevo

Really great. One of the best explanations I've ever seen.

jamalnuman

Great video. Very, very important for gaining success in product-based companies.

sandipansarkar

I like this video because, of all the videos, yours is the accent I can understand. I hope you can redo the video.

alishazel

Please make a practical implementation... much needed, and it's going to be amazing!

shubhammore

Hi Krish, there is an issue with the similarity formula. It should be "(sum of residuals) squared / (number of residuals + lambda)", but you have written "sum of (residuals squared) / (number of residuals + lambda)".

vatsalkachhiya

How are you selecting < 2 and > 2? Please clarify.

khubeb
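A sketch of the likely answer: XGBoost sorts the values of a feature and evaluates the gain of every candidate threshold (e.g. the midpoints between adjacent values); the threshold with the highest gain, such as "< 2", is the one kept. A minimal sketch assuming squared-error loss; the feature values and residuals below are hypothetical, not the video's exact numbers:

```python
def similarity(residuals, lam=0.0):
    """Similarity score for squared-error loss: (sum of residuals)^2 / (n + lambda)."""
    return sum(residuals) ** 2 / (len(residuals) + lam)

def best_split(feature_values, residuals, lam=0.0):
    """Evaluate every midpoint between sorted feature values and
    return (threshold, gain) for the split with the highest gain."""
    pairs = sorted(zip(feature_values, residuals))
    root_sim = similarity(residuals, lam)
    best = (None, float("-inf"))
    for i in range(1, len(pairs)):
        threshold = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [r for x, r in pairs if x < threshold]
        right = [r for x, r in pairs if x >= threshold]
        gain = similarity(left, lam) + similarity(right, lam) - root_sim
        if gain > best[1]:
            best = (threshold, gain)
    return best

print(best_split([1, 2, 3, 4], [-10, 7, 8, 9]))  # -> (1.5, 243.0)
```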

Very clear and understandable explanation. Keep posting and keep growing.

ronylpatil

For the similarity weight of the root, the squares add up to 405. Did you just cancel them all?

matx
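A plausible reading of the "cancellation", assuming the base prediction at the root is the mean of the targets: the numerator of the similarity score is the square of the sum of the residuals, so when the residuals sum to (approximately) zero the numerator vanishes, even though the sum of squares (405 here) does not:

```latex
\text{Similarity}_{\text{root}} = \frac{\left(\sum_i r_i\right)^2}{n + \lambda}
\approx \frac{0^2}{n + \lambda} = 0,
\qquad \text{even though } \sum_i r_i^2 = 405.
```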

Nicely explained. Keep uploading more and more videos, @Krish Naik Sir.

nishiraju

Krish, can you suggest some references for gaining in-depth theoretical knowledge about various machine learning and deep learning models? I am currently pursuing a master's in Statistics, so a good chunk of them is covered in my syllabus, but things like NLP, DL, XGBoost, recommender systems, etc. are not included. Anyway, your videos are great to watch.

abhisek-chatterjee

The moment you wrote 20/2 = 10 (instead of -10) as the gain of the left branch, I realized what "gradient exploding" means :D:D:D Many thanks for these awesome tutorials!

chenqu

Nice implementation explanation. StatQuest + this tutorial is a very effective combination for grasping this concept.

ashwinshetgaonkar
Автор

In the similarity weight computation, you're squaring the residual sum instead of summing the residual squares. Is that correct?

MuriloCamargosf

19:20. How is the gamma value of 150 set? Who assigns this?

SidIndian
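To answer the question above: gamma is not computed by the algorithm. It is a user-chosen regularization hyperparameter (exposed as gamma, alias min_split_loss, in the Python xgboost package), and a branch survives pruning only when its gain exceeds gamma. A minimal sketch, reusing the gain and gamma values quoted in these comments:

```python
import xgboost as xgb

# gamma is supplied by the user, typically tuned by cross-validation;
# larger values prune more aggressively.
model = xgb.XGBRegressor(gamma=150)

def keep_split(gain: float, gamma: float) -> bool:
    """XGBoost post-pruning rule: keep a split only if gain - gamma > 0."""
    return gain - gamma > 0

print(keep_split(gain=143.48, gamma=150))  # False -> this branch is pruned
```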