AdaBoost Algorithm | Code from Scratch

This video guides you through building the AdaBoost algorithm step by step in Python. Perfect for beginners who want to understand and implement AdaBoost on their own.

Comments

CampusxBoost(consistency=True,
             hard_work=100%,
             quality=outstanding,
             future_ML=bright)
Salute to you sir ❤

Shisuiii

No words, just the best from-scratch explanation. I have watched my professor's explanation but always got puzzled about how they derived it; you just nailed the in-depth explanation at each step.

iyqifvi

Literally watching your 100th video in this series!
Your content is really informative and addictive!
The journey of learning from Day 1 to now has been full of excitement.
Thank you for this wonderful contribution to the Data Science community!

abhasmalguri

Timestamp: 13:31
The training of the third model with the DecisionTreeClassifier contains an error: third_df should have been used instead of second_df. It is understandable; such mistakes can happen while coding live. Nevertheless, it is still a great implementation. Thank you.
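The off-by-one slip noted in this comment is easy to rule out by training all stumps in a loop over a single variable instead of separate first_df/second_df/third_df names. A minimal sketch under assumed names (toy data, with a plain bootstrap standing in for the weighted upsampling done in the video):

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy dataset with binary labels in {-1, +1}.
df = pd.DataFrame({"X1": rng.normal(size=20), "X2": rng.normal(size=20)})
df["label"] = np.where(df["X1"] + df["X2"] > 0, 1, -1)

# Fitting each stump on the DataFrame built for *that* round makes
# the copy-paste mistake of reusing second_df impossible.
current_df = df
stumps = []
for round_no in range(3):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(current_df[["X1", "X2"]], current_df["label"])
    stumps.append(stump)
    # ... the weight update and weighted upsampling would go here;
    # a plain bootstrap stands in for that step in this sketch ...
    current_df = (current_df.sample(n=len(current_df), replace=True,
                                    random_state=round_no)
                  .reset_index(drop=True))

print(len(stumps))  # 3 fitted decision stumps
```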

thisisvishalpandey

100 videos done! Your clear explanations & passion make ML a blast! 🚀🤖🙌

sahilkayastha

There is a slight logical mistake in sir's method of creating a new dataset after upsampling: the index of the new dataset should be reset after creating it. If the index is not reset, the next dataset will not contain the desired rows (you can check this yourself). So, each time after creating the new dataset, run this line before using it for the next decision stump:

new_df = new_df.reset_index(drop=True)
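A tiny demonstration of why the reset matters, assuming the upsampling is done with `DataFrame.sample(replace=True)` as in a typical pandas workflow:

```python
import pandas as pd

df = pd.DataFrame({"x": [10, 20, 30, 40, 50]})

# Sampling with replacement keeps the original index labels, so the
# upsampled frame can carry duplicates (e.g. label 1 appearing twice).
up = df.sample(n=5, replace=True, random_state=0)

# With duplicate labels, up.loc[i] can return several rows, which
# silently breaks any "take row i" logic in the next round.
up = up.reset_index(drop=True)
print(list(up.index))  # [0, 1, 2, 3, 4]
```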

RahulRaj-nwrr

Why do we need to up-sample the dataset if we have already updated the weights of the correctly and incorrectly predicted samples? The incorrect ones would automatically have a greater say in the subsequent rounds, right? Also, if we up-sample and create a new dataset in each iteration, some rows (the correctly classified ones) will keep disappearing, and the dataset will be left with only a handful of copies of the misclassified samples. Wouldn't the diminishing representation of the rows in each iteration prevent the model from learning the deeper trends/patterns in the data?
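This intuition matches how library implementations actually work: scikit-learn's AdaBoost reweights samples rather than resampling them, and the video's upsampling is essentially a way to visualise the weights. A sketch of the pure weight-update variant (toy labels and predictions are made up for illustration):

```python
import numpy as np

# AdaBoost's weight update keeps every row: instead of physically
# upsampling, each weight is multiplied by exp(-alpha * y * h(x)),
# so misclassified rows gain influence without any row vanishing.
y    = np.array([1, 1, -1, -1, 1])   # true labels in {-1, +1}
pred = np.array([1, -1, -1, 1, 1])   # stump predictions (rows 1 and 3 wrong)
w    = np.full(5, 1 / 5)             # initial uniform weights

err   = w[pred != y].sum()              # weighted error = 0.4
alpha = 0.5 * np.log((1 - err) / err)   # the stump's "amount of say"

w = w * np.exp(-alpha * y * pred)       # up-weight mistakes, down-weight hits
w /= w.sum()                            # renormalise to sum to 1

print(np.round(w, 3))  # [0.167 0.25  0.167 0.25  0.167]
```

After the update the two misclassified rows jointly carry half the total weight, yet every row is still present for the next stump.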

anaykhator

Great, sir.
I really enjoyed watching this series.

core

Hi Nitish, thank you for your Machine Learning playlist. It is helping me a lot as I switch my career from Project Management to Data Science.

In this video, for the 2nd df, you said that row 0 was wrongly classified, but it was row 2, as you said at 12:54. Yet at 13:22 you said the 0th row was the wrongly classified one. In the upsampling we didn't get row 2; we only got row 0.

I need your clarification on this. Hope you will respond.

abhinavkhandelwal-ww

Hats off to you, because I think you are the only one who has shown the algorithm working in a notebook, from scratch.

AltafAnsari-tfnl

Hello Sir, XGBoost is not included in the playlist. Could you please make a video on XGBoost?

iamdSachin

Sir, please upload videos on a daily basis, as I am following your classes and some blogs to learn data science 🙏

kislaykrishna

While training the third model, you took values from the second model, which is why you got so many misclassified values in the third round. Anyway, the video was great.

singh

Thank you very much, sir. Really nice video.

aditya_

But how can we extend this code to a multiclass classification problem, since the sign test applies only to the binary case?
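The sign test indeed only covers binary ±1 labels. The standard multiclass extension is the SAMME algorithm, which replaces sign() with an argmax over per-class weighted votes and adds log(K−1) to alpha, so a weak learner only needs accuracy better than 1/K. scikit-learn's AdaBoostClassifier implements this; a sketch (the default base estimator is already a depth-1 decision stump):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier

# Iris has 3 classes, so sign(F(x)) no longer applies;
# AdaBoostClassifier handles the multiclass case internally.
X, y = load_iris(return_X_y=True)

clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

Extending the from-scratch notebook the same way would mean keeping one weighted vote total per class instead of a single signed sum.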

meenatyagi

Are we going to use the updated weights in the second model, or should we use the same weights as model 1?

mustafamarvat

I have a question: why didn't he create a video on AdaBoostRegressor?

rajgurubhosale

Hi sir, please help with one question: I have a single input, but my output can be up to 10. Is that possible, sir?

murumathi

Sir, can we use the same code for an image dataset?

aditiarora

Sir, I have one doubt: if the error is more than 0.5, then alpha is negative, so it will start decreasing the weights of the wrongly classified samples and increasing the weights of the correctly classified ones.
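The observation is correct: alpha = ½·ln((1−ε)/ε) turns negative once ε > 0.5, which flips the update exactly as described. In the binary case that is still meaningful, because a stump with ε > 0.5 becomes useful if its predictions are inverted (error 1−ε < 0.5), and the negative alpha encodes precisely that inversion. A small check, with the epsilon clipping being a common implementation convention rather than something from the video:

```python
import math

def amount_of_say(error, eps=1e-10):
    # Clip so the log stays finite when error is exactly 0 or 1.
    error = min(max(error, eps), 1 - eps)
    return 0.5 * math.log((1 - error) / error)

print(round(amount_of_say(0.3), 3))  # 0.424  -> better than chance, positive say
print(amount_of_say(0.5))            # 0.0    -> coin flip, no say
print(round(amount_of_say(0.7), 3))  # -0.424 -> negative: same stump, inverted
```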

pradeeptamohanty