Random Forest Regression Introduction and Intuition

Welcome to "The AI University".

About this video:
This video, titled "Random Forest Regression Introduction and Intuition", explains the ensemble learning method and its main techniques, bagging and boosting. It then explains the approach the random forest algorithm takes to build a random forest regression model, and finally covers the advantages and disadvantages of the Random Forest Regression algorithm.
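As a rough illustration of the bagging idea behind random forest regression, here is a minimal pure-Python sketch that fits depth-1 regression trees (stumps) on bootstrap samples and averages their predictions. The helper names (`fit_stump`, `random_forest_fit`) are illustrative, not from the video, and a real random forest would also sample a random subset of features at each split:

```python
import random

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree: pick the threshold that minimizes
    squared error, predicting the mean of each side of the split."""
    best = None  # (error, threshold, left_mean, right_mean)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def random_forest_fit(xs, ys, n_trees=25, seed=0):
    """Bagging: fit each stump on a bootstrap sample (drawn with
    replacement) and average the predictions of all the stumps."""
    rng = random.Random(seed)
    n = len(xs)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(tree(x) for tree in trees) / len(trees)

# Toy data: a step function that a single stump can roughly capture.
xs = list(range(10))
ys = [0.0] * 5 + [10.0] * 5
predict = random_forest_fit(xs, ys)
```

In practice you would use a library implementation such as scikit-learn's `RandomForestRegressor`, which grows full decision trees and also randomizes the features considered at each split.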

Subtitles available in: English

About this Channel:
The AI University is a channel on a mission to democratize Artificial Intelligence, Big Data Hadoop, and Cloud Computing education for the entire world. The aim of this channel is to impart knowledge to data science, data analysis, data engineering, and cloud architecture aspirants, as well as to provide advanced knowledge to those who already have some background in these areas.

Please share, comment, like, and subscribe if you liked this video. If you have any specific questions, you can post them in the comment section and I'll definitely try to get back to you.


DISCLAIMER: This video and description may contain affiliate links, which means that if you click on one of the product links, I’ll receive a small commission.

#RandomForestRegression #EnsembleLearning #MachineLearning
Comments

Trivia question from the video: state true or false:
Decision trees are computationally expensive.

TheAIUniversity

Good video, thanks! However, I think you missed talking about the bootstrapping procedure in the "What is random forest regression?" slide. A bootstrap sample is a random sample of the data taken with replacement, meaning that after a sample is selected for inclusion in the subset, it's still available for further selection. On average, ≈ 63.21 % of the original samples end up in any particular bootstrap sample. Then features/predictor variables are also selected at random to build the decision trees.
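The ≈ 63.21 % figure in the comment above follows from 1 − (1 − 1/n)^n, which approaches 1 − 1/e ≈ 0.6321 as n grows. A quick stdlib-only simulation (illustrative, not from the comment) confirms it:

```python
import random

def bootstrap_unique_fraction(n, trials, seed=0):
    """Average fraction of the n original points that appear at least
    once in a bootstrap sample of size n (drawn with replacement)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = {rng.randrange(n) for _ in range(n)}  # distinct indices drawn
        total += len(sample) / n
    return total / trials

frac = bootstrap_unique_fraction(n=1000, trials=200)
# The exact expectation is 1 - (1 - 1/1000)**1000 ≈ 0.6323, close to 1 - 1/e.
```

The points left out of a given bootstrap sample (the remaining ≈ 36.8 %) are what random forests use for out-of-bag error estimates.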

dr.cancan

Hello, are model tree ensembles and random forests the exact same thing, or is a random forest a type of model tree ensemble?

shawnstein

Yes, decision trees are computationally expensive.

mcbhuva

You didn't even explain random forest regression.

jayasinha