How To Perform Post Pruning In Decision Tree? Prevent Overfitting - Data Science

Please join as a member of my channel to get additional benefits like Data Science materials, live streams for members, and much more.

Please also subscribe to my other channel.

Connect with me here:
Comments

Hello Krish! I have just one simple question. Is it necessary to perform CCP if we are already doing hyperparameter tuning? And if we can only do one of the two, which will give the best result?

ayushsrivastav
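
Since ccp_alpha is itself just another hyperparameter of scikit-learn's DecisionTreeClassifier, the two are not either/or: CCP can be folded into the same search. A minimal sketch, not from the video; the dataset and grid values are illustrative assumptions:

# Tune pre-pruning and post-pruning (ccp_alpha) settings in a single search.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

param_grid = {
    "max_depth": [3, 5, None],        # pre-pruning
    "min_samples_leaf": [1, 5, 10],   # pre-pruning
    "ccp_alpha": [0.0, 0.005, 0.01],  # post-pruning (illustrative values)
}

search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))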

Before even watching, I hit Like. Thank you for all this effort (respect from France).

mmarva

What is the math behind selecting the best alpha value in cost complexity pruning? Can you please make a video on that?

_curiosity...
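
For reference, the quantity scikit-learn's minimal cost-complexity pruning minimizes (following Breiman et al., 1984) is

    R_\alpha(T) = R(T) + \alpha \, |\tilde{T}|

where R(T) is the total impurity of the leaves of subtree T and |\tilde{T}| is its number of leaves. The "effective alpha" of an internal node t, i.e. the value of alpha at which collapsing it becomes worthwhile, is

    \alpha_{\mathrm{eff}}(t) = \frac{R(t) - R(T_t)}{|\tilde{T}_t| - 1}

cost_complexity_pruning_path returns these effective alphas in increasing order; any ccp_alpha between two consecutive values selects the same pruned subtree.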

I have a small question: if we get an overfitted model, is tuning the CCP value sufficient, or do we have to do hyperparameter tuning over all the other values too?

ajaykushwaha-jemw

Sir, a pruned decision tree will not fit the entire training data, but it is still preferred over an unpruned one. Why?

parveenparveen

Is CCP enough for hyperparameter tuning, or do we also have to use GridSearchCV for better accuracy? Please clarify.

karthebans

Hello Krish - is it always necessary to use this technique? Does it always prove to enhance accuracy?

sharmaramdhan

Thank you so much. This is exactly what I need to learn. Great explanations.

fundatamdogan

Hello sir, does pruning also work for a decision tree regressor?

omkarbabar
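
Yes: DecisionTreeRegressor exposes the same ccp_alpha parameter and cost_complexity_pruning_path method as the classifier. A minimal sketch (dataset and the choice of alpha are illustrative):

# Post-pruning a regression tree works exactly like the classifier case.
from sklearn.datasets import load_diabetes
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)

# Candidate alphas for the regressor, smallest (no pruning) first.
path = DecisionTreeRegressor(random_state=0).cost_complexity_pruning_path(X, y)

# Fit with one illustrative alpha from the middle of the path.
reg = DecisionTreeRegressor(random_state=0,
                            ccp_alpha=path.ccp_alphas[len(path.ccp_alphas) // 2])
reg.fit(X, y)
print("leaves after pruning:", reg.get_n_leaves())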

Can we create decision trees for XGBoost or LightGBM models?

mansikumari
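
If the question is about visualizing the individual trees inside a boosted ensemble: both libraries ship plotting helpers, though they use their own regularization parameters rather than ccp_alpha. A sketch, assuming xgboost plus matplotlib and graphviz are installed:

# Plot one tree out of a fitted XGBoost ensemble.
import matplotlib.pyplot as plt
import xgboost as xgb
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
model = xgb.XGBClassifier(n_estimators=10).fit(X, y)

xgb.plot_tree(model, num_trees=0)  # first tree of the ensemble
plt.show()

# LightGBM has an analogous helper:
#   import lightgbm as lgb
#   lgb.plot_tree(lgb_model, tree_index=0)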

Is it good to do hyperparameter tuning first and then cost complexity pruning?

arunmohan

Please discuss Reduced Error Pruning and Pessimistic Pruning.

adiflorense

Why is this called post-pruning, whereas pre-pruning works in a similar way (just with other parameters)?

doglibrary
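
The difference is when the cut happens: pre-pruning constrains the tree while it is being grown (max_depth, min_samples_leaf, ...), whereas post-pruning first grows the full tree and then collapses branches that don't pay for their complexity. A sketch contrasting the two (dataset and parameter values are illustrative):

from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Pre-pruning: stop early while growing the tree.
pre = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10,
                             random_state=0).fit(X, y)

# Post-pruning: grow the full tree, then cut branches whose
# cost-complexity improvement is below ccp_alpha.
post = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X, y)

print("pre-pruned leaves: ", pre.get_n_leaves())
print("post-pruned leaves:", post.get_n_leaves())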

The graph is slightly different when I run the same code, and ccp_alphas has one extra value. Why would that be? I get the result you have at @6:36 in cell no. 11, but whereas the value changes when you rerun it, for me it remains the same.

When I run the code, I get:
Number of nodes in the last tree is: 1 with ccp_alpha: 0.3272984419327777

Any explanation?

ramendrachaudhary
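
A plausible explanation (an assumption, since the exact notebook isn't shown here): the candidate alphas come from cost_complexity_pruning_path, which depends on the exact tree grown, so a different scikit-learn version, train/test split, or an unset random_state changes the list and its length. Fixing both seeds makes every rerun identical:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Fix both sources of randomness: the split and the tree builder's tie-breaking.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(random_state=0)

path = clf.cost_complexity_pruning_path(X_train, y_train)
# The largest alpha always prunes the tree down to a single root node.
print(len(path.ccp_alphas), path.ccp_alphas[-1])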

In this case you have chosen random_state as 0, but if I change the random_state value, the accuracy also changes. So my question is: in that case, if I change random_state, can we also see changes in the best alpha or max_depth?

adityachakaraborty

What are the criteria for selecting the ccp_alpha value?

aakashyadav
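
The usual criterion (following the example in the scikit-learn documentation) is empirical: fit one tree per candidate alpha, plot train and test accuracy against alpha, and pick the alpha where test accuracy peaks. A sketch with an assumed dataset:

import matplotlib.pyplot as plt
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(
    X_train, y_train)

# One tree per candidate alpha.
trees = [DecisionTreeClassifier(random_state=0, ccp_alpha=a).fit(X_train, y_train)
         for a in path.ccp_alphas]
train_acc = [t.score(X_train, y_train) for t in trees]
test_acc = [t.score(X_test, y_test) for t in trees]

plt.plot(path.ccp_alphas, train_acc, marker="o", label="train")
plt.plot(path.ccp_alphas, test_acc, marker="o", label="test")
plt.xlabel("ccp_alpha"); plt.ylabel("accuracy"); plt.legend(); plt.show()

# Pick the alpha where test accuracy peaks.
best = path.ccp_alphas[max(range(len(trees)), key=lambda i: test_acc[i])]
print("best ccp_alpha:", best)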

Thanks for the video. Very informative.

aninsignificantman

Here the test dataset has been used twice to get the best model. Maybe splitting off a validation set would be better?

finnzhang
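
A fair point: picking alpha on the test set and then reporting accuracy on the same set leaks information. One remedy (a sketch, not from the video) is to select alpha by cross-validation on the training data only and touch the test set exactly once at the end:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate alphas from the training data only.
alphas = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(
    X_train, y_train).ccp_alphas

# Choose alpha by cross-validation inside the training set...
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      {"ccp_alpha": list(alphas)}, cv=5)
search.fit(X_train, y_train)

# ...and evaluate on the held-out test set exactly once.
print(search.best_params_, "test accuracy:", search.score(X_test, y_test))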

accuracy_score for classification? roc_auc would be a better option.

ashwinshetgaonkar
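
For a binary problem it is easy to score the pruned tree with ROC AUC instead (a sketch; the dataset and alpha value are illustrative, and AUC needs probabilities rather than hard labels):

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(X_train, y_train)

# roc_auc_score expects scores/probabilities for the positive class.
proba = clf.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, proba))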

Sir, can you make a video in which we convert the input feature values using standardization and then deploy the model using Flask?

shivamkala