Ensemble Method : Boosting ll Machine Learning Course Explained in Hindi

I am Shridhar Mankar: an Engineer | YouTuber | Educational Blogger | Educator | Podcaster.
My aim: to make engineering students' lives easy.

Playlists :

• 5 Minutes Engineering Podcast :

• Aptitude :

• Machine Learning :

• Computer Graphics :

• C Language Tutorial for Beginners :

• R Tutorial for Beginners :

• Python Tutorial for Beginners :

• Embedded and Real Time Operating Systems (ERTOS) :

• Shridhar Live Talks :

• Welcome to 5 Minutes Engineering :

• Human Computer Interaction (HCI) :

• Computer Organization and Architecture :

• Deep Learning :

• Genetic Algorithm :

• Cloud Computing :

• Information and Cyber Security :

• Soft Computing and Optimization Algorithms :

• Compiler Design :

• Operating System :

• Hadoop :

• CUDA :

• Discrete Mathematics :

• Theory of Computation (TOC) :

• Data Analytics :

• Software Modeling and Design :

• Internet Of Things (IOT) :

• Database Management Systems (DBMS) :

• Computer Network (CN) :

• Software Engineering and Project Management :

• Design and Analysis of Algorithm :

• Data Mining and Warehouse :

• Mobile Communication :

• High Performance Computing :

• Artificial Intelligence and Robotics :
Comments

1. Bagging is a parallel learning process, whereas boosting is sequential.
2. Boosting is iterative, whereas bagging doesn't have to be.
3. Boosting can increase over-fitting, whereas bagging generally decreases it.


Ensemble learning works well when different models make independent mistakes, i.e., when different models make mistakes on different examples.
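The point about independent mistakes can be illustrated with a quick simulation (a minimal sketch; the 30% error rate, the ensemble of 15 models, and the assumption of fully independent errors are all illustrative choices, not anything from the video):

```python
import random

random.seed(0)

N = 10_000   # number of test examples
M = 15       # number of models in the ensemble
P_ERR = 0.3  # each model errs independently with this probability

# Simulate M models whose mistakes are independent: for each example,
# count how many models get it right.
correct_votes = [sum(random.random() > P_ERR for _ in range(M)) for _ in range(N)]

single_acc = sum(random.random() > P_ERR for _ in range(N)) / N
ensemble_acc = sum(votes > M / 2 for votes in correct_votes) / N

print(f"single model accuracy:  {single_acc:.3f}")
print(f"majority-vote accuracy: {ensemble_acc:.3f}")
```

With independent errors, the majority vote is wrong only when 8 or more of the 15 models err on the same example, which is far rarer than a single model's 30% error rate. If the models' mistakes were perfectly correlated, the vote would gain nothing.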

kitagrawal

You sir should write a book! I bet it'll be a bestseller... my university professors couldn't explain it this way in a whole semester!

anur

0:35 Sir got no chill 😂😂 Awesome, he ended the confusion entirely... multiple-choice answer

iamvbj

Your videos on boosting and bagging are far more informative, clear, and better than Udacity's. Thanks a lot, sir, and keep up the good work. Your videos are not only helping UG students with semester exams, but learners at all other levels too. I hold a master's in data analytics from one of the oldest IITs, and I still clear up my concepts with your videos for PG placements. Your way of teaching is amazing!

subhadeep

1. Initialise the dataset and assign an equal weight to each data point.
2. Provide this as input to the model and identify the wrongly classified data points.
3. Increase the weights of the wrongly classified data points.
4. If the required results are achieved, stop; otherwise go to step 2.
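The numbered steps above are essentially AdaBoost. A minimal, self-contained sketch in Python (the toy 1-D dataset, the decision-stump weak learners, and the choice of three boosting rounds are all illustrative assumptions):

```python
import math

# Toy 1-D dataset: (x, label) pairs with labels in {-1, +1}.
data = [(0, 1), (1, 1), (2, 1), (3, -1), (4, -1), (5, 1), (6, -1), (7, -1)]
n = len(data)
weights = [1.0 / n] * n                      # step 1: equal weights

def stump(threshold, sign):
    """Weak learner: predict `sign` if x < threshold, else -sign."""
    return lambda x: sign if x < threshold else -sign

# Candidate weak learners.
candidates = [stump(t, s) for t in range(1, 8) for s in (1, -1)]

ensemble = []                                # list of (alpha, learner)
for _ in range(3):                           # a few boosting rounds
    # Step 2: pick the stump with the lowest weighted error.
    def werr(h):
        return sum(w for (x, y), w in zip(data, weights) if h(x) != y)
    h = min(candidates, key=werr)
    err = max(werr(h), 1e-10)
    alpha = 0.5 * math.log((1 - err) / err)  # this learner's vote strength
    ensemble.append((alpha, h))
    # Step 3: up-weight the misclassified points, then renormalise.
    weights = [w * math.exp(-alpha * y * h(x)) for (x, y), w in zip(data, weights)]
    z = sum(weights)
    weights = [w / z for w in weights]

def predict(x):
    # Final strong learner: weighted vote of the weak learners.
    return 1 if sum(a * h(x) for a, h in ensemble) > 0 else -1

accuracy = sum(predict(x) == y for x, y in data) / n
print(f"training accuracy after 3 rounds: {accuracy:.2f}")
```

Each round, points the previous stump got wrong carry more weight, so the next stump is pulled toward fixing exactly those mistakes — the "increase the weight of the wrongly classified data points" step above.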

rugved

WTF, today my lecturer spent an hour on this and I still didn't understand it.
How do you explain it so well in under 10 minutes? I can't figure it out . . .
You're great, my friend. Lots of love and respect from Pakistan.
As the class CR, it is now my duty to share your videos with my classmates.

ShaidaMuhammad

Your sweat shows that you are really working so hard to deliver such amazing content!!

radhasingh

Sir, please continue making such videos. We watch your videos while studying at US universities because you explain so well.

sam

Machine learning in Hindi. Best thing I have seen on YouTube today. True democratisation of knowledge.

rohanbhavale

Thank you sir, I have been watching a lot of boosting videos, but after watching yours it's not required anymore. Excellent teacher!

milliesadie

Thanks, sir..
Please make videos on cloud computing..
And more videos on machine learning.

priyakhedekar

Seriously, every video is amazing. Love you, sir.

sheetala_tiwari

Didn't know Rohit Sharma could teach so well

pushkarbansal

What a good lecture. You have explained it very clearly. Thank you so much.

dr.junaidslectures

Yes, could you also make similar videos on gradient descent, gradient boosting machines, and XGBoost...

SARTHAKbhatnagar

Amazing Sir!! You made it so easy to understand!! Appreciate it!!

SARTHAKbhatnagar

Very nice video, brother.
Please cover Adaptive Boosting, Gradient Boosting & XGBoost.

bhavikdudhrejiya

Commenting to boost your videos in the algorithm

alipbhaumik

Bagging (Bootstrap Aggregating): In bagging, each base learner is trained on a different bootstrap sample (randomly selected with replacement) from the original dataset. This means that each learner sees a slightly different version of the dataset.

Boosting: In boosting, each base learner is trained on the entire dataset, but the weights on the data points are adjusted based on the performance of the previous models. This means that each learner focuses more on the data points that were misclassified by previous learners.
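The difference in how the two families see the data can be shown in a few lines (a minimal sketch; the ten-example dataset, the doubling factor, and the hypothetical mistakes of the first learner are assumptions for illustration only):

```python
import random

random.seed(42)
dataset = list(range(10))  # ten example indices

# Bagging: each base learner trains on a bootstrap sample drawn WITH
# replacement, so it typically sees only ~63% of the unique examples.
bootstrap = [random.choice(dataset) for _ in dataset]
print("bootstrap sample:", sorted(bootstrap))
print("unique examples seen:", len(set(bootstrap)))

# Boosting: every learner sees the FULL dataset, but with per-example
# weights that grow on points the previous learner misclassified.
weights = [1 / len(dataset)] * len(dataset)
misclassified = {3, 7}                 # hypothetical mistakes of learner 1
weights = [w * (2.0 if i in misclassified else 1.0)
           for i, w in zip(dataset, weights)]
total = sum(weights)
weights = [w / total for w in weights]  # renormalise to sum to 1
print("reweighted:", [round(w, 3) for w in weights])
```

So bagging diversifies learners by varying the data each one sees, while boosting diversifies them by varying how much each data point matters.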

storyofstories

• Boosting is a powerful ensemble technique used in machine learning to improve the accuracy and performance of models.
- It combines multiple weak learners to create a strong learner.
- It does this by iteratively training models, adjusting weights based on the performance of previous models, and combining their predictions to produce a final, robust prediction.

emanrazzaq