Machine Learning Tutorial Python - 21: Ensemble Learning - Bagging

Ensemble learning is all about combining the predictive power of multiple models to get better predictions with lower variance. Bagging and boosting are two popular techniques that allow us to tackle the high-variance issue. In this video we will learn about bagging with a simple visual demonstration. We will also write Python code in sklearn to use BaggingClassifier. And oh yes, at the end we have an exercise for you, as always!
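
For reference, here is a minimal sketch of the sklearn API the video demonstrates; the synthetic dataset and all parameter values below are illustrative stand-ins, not the video's exact code:

```python
# Minimal BaggingClassifier sketch; make_classification stands in for the
# video's dataset, and the parameter values are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

bag = BaggingClassifier(
    estimator=DecisionTreeClassifier(),  # `base_estimator` in scikit-learn < 1.2
    n_estimators=100,  # train 100 trees, each on its own bootstrap sample
    max_samples=0.8,   # each tree sees 80% of the rows, drawn with replacement
    oob_score=True,    # score each tree on the rows it never saw
    random_state=0,
)
bag.fit(X, y)
print(bag.oob_score_)                           # out-of-bag accuracy
print(cross_val_score(bag, X, y, cv=5).mean())  # cross-validated accuracy
```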

⭐️ Timestamps ⭐️

00:00 Theory
08:01 Coding
22:25 Exercise

❗❗ DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.
Comments

This exercise was a challenge. Thank you. By just taking the pure z-score of the dataset, some outliers were missed. Basically, all the outliers were the 0s for blood pressure and cholesterol. With those eliminated, I got significantly higher scores than the solution. All bagged models gave a similar 86% accuracy. The biggest jump from non-bagged model to bagged model was the decision tree, which went from 79% accuracy without bagging to 86% with bagging. Also, I did the exercise several months after this video (was posted - not sure when it was made), so the libraries (especially SVC) may have improved (in their defaults).

paulkornreich
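
A hedged sketch of the cleanup described in the comment above, assuming the exercise's heart-disease data sits in a CSV with columns named trestbps (resting blood pressure) and chol (cholesterol); the file and column names are assumptions, not confirmed by the video:

```python
# Drop physically impossible zero readings before z-score filtering;
# 'heart.csv', 'trestbps', and 'chol' are assumed names.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("heart.csv")

# A pure z-score filter can miss the zeros: a 0 is not necessarily three
# standard deviations from the column mean, so remove impossible values first.
df = df[(df["trestbps"] > 0) & (df["chol"] > 0)]

# Then apply the usual |z| < 3 rule to the numeric columns.
z = np.abs(stats.zscore(df.select_dtypes("number")))
df = df[(z < 3).all(axis=1)]
```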

So far this is the best explanation of the bagging technique I have found on YouTube! Great work.

justin.c

Thank you so much, sir, for this ML playlist. Your explanations are simple, exact, and extremely easy to follow. The method you use of first familiarizing us with the theory, then a practical example, and then an exercise is really effective. Looking forward to more such videos in your ML series. Thanks once again, sir.

ritvijmishra

This channel is golden. I really like how you explain the concepts all the way through to practical coding.

SinarJourney

In this exercise, when I used the bagging classifier with SVM, there wasn't any change at all. But when we used it with the decision tree classifier, the score increased from 0.821 to 0.86 (82% to 86%).

anitoons

This was nice and straightforward, and the quip about "copy and paste" was hilarious.

jasonwang-wgwu

One of the most underrated playlists for ML. I wish lots of students would join ❤

siddheshmhatre

Your tutorial series is teaching me a lot, Sir. It is so well organized. You have made these topics so much easier to learn and understand. Hats off to your hard work. A blind follower of you, Sir. Loads of love. <3

khanshian

My results for the exercise: SVM standalone 0.8, after bagging 0.8; decision tree standalone 0.65, after bagging 0.79. Bagging helps improve accuracy and reduce overfitting, especially in models that have high variance. It is mostly used for unstable models like decision trees.

Koome

Hi Dhaval! Simple & useful explanation as always. Keep doing more videos.
However, @11:40 I believe we have to do the train/test split first and then perform the standard scaling operation, not the other way around. Aren't we running into the problem of data leakage if we do standard scaling on all the data points before the train/test split? Let me know your thoughts. Thanks!

upendrar
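
A minimal sketch of the order suggested in the comment above: split first, then fit the scaler on the training portion only, so no test-set statistics leak into preprocessing (make_classification stands in for the video's data):

```python
# Split first, then scale; the scaler learns its mean/std from X_train only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=10
)

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # fit on training data only
X_test_scaled = scaler.transform(X_test)        # reuse the train statistics
```

With cross_val_score, the same effect can be achieved by wrapping the scaler and the model in a Pipeline, so the scaler is refit inside each fold.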

Thumbs up! You cannot learn swimming just by watching.
Who else takes the pain of providing exercises? When I was trying to learn in the beginning, this was what I wanted. At least someone is providing it now.

rahulranjan

Using the standalone SVC model I got a better score than with bagging: 0.902 versus 0.851.
Using a standalone decision tree I got 0.782, versus 0.84 with bagging.

Bagging helps reduce the overfitting (high variance) caused by decision trees by averaging multiple decision trees.

slainiae
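
A short sketch of that comparison on stand-in data (the exact scores depend on the dataset, the splits, and library versions):

```python
# Compare a single decision tree against a bagged ensemble of trees.
# make_classification with label noise (flip_y) stands in for the exercise data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, flip_y=0.1, random_state=1)

tree = DecisionTreeClassifier(random_state=0)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0)

print("single tree :", cross_val_score(tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())
```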

The explanation is very easy to follow... well understood. 👍👍👍

SamjhoPaiseKo

That clearly described what the bagging method is. I wish you had a video about boosting as well.

elahe

Why are we fitting our model on X, y?
Then what is the use of x_train and y_train? And what is the use of scaling if we are training our model on the original X and y?

ankitjhajhria

Thank you so much, Sir, for teaching us a lot of things. I was searching here and there for ensemble learning and your video just showed up. You are a lifesaver. Thanks a lot!!

malikhamza

Thank you for this wonderful explanation. I have a query here. We scaled X, but everywhere in cross_val_score we use the original X. Could you please explain why we scaled X?

vikranttripathi

Thanks a lot for providing very necessary and important content!

shafinhossain

You have to do outlier detection because the max is much higher than the 75% value.

beakaiwalyakhairnar
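
A quick way to run that check, assuming the exercise data is loaded into a pandas DataFrame (the file name is an assumption):

```python
# A max far above the 75th percentile hints that a column contains outliers.
import pandas as pd

df = pd.read_csv("heart.csv")  # assumed file name
print(df.describe().loc[["75%", "max"]])
```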

I hope you see my questions; you never respond to my questions. Why did you not fit the BaggingClassifier with (x_train, y_train) in the exercise?

nastaran