AI Simplified: SHAP Values in Machine Learning


Stay connected with DataRobot!
Comments

Thank you Mark for a great and simple example of SHAP values!

I used SHAP as a feature-selection tool for my XGBoost decision-tree model. After obtaining the feature importances, I noticed that the SHAP values of some features are equal to zero. Investigating in more detail, I found that the features with zero SHAP values are collinear. For example, if features A and B are highly correlated, the SHAP value of B ends up being zero.

However, from my understanding, the Shapley value uses cooperative game theory to compute the contribution of each signal. So, in the case of multicollinearity, shouldn't A and B have the same Shapley value? Also, if the Shapley value can handle multicollinearity, how does SHAP pick one signal and ignore the other correlated features?

suratasvapoositkul
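A minimal sketch of the situation described in the comment above, using the shap and xgboost Python libraries. The feature names A, B, C, the synthetic data, and the model settings are illustrative assumptions, not anything from the video; the point is only that when two inputs are near-duplicates, a tree ensemble tends to split on one of them, so the SHAP credit concentrates on that feature while the other's attributions sit near zero, even though a game-theoretic Shapley allocation over the raw inputs would share the credit more evenly.

```python
import numpy as np
import pandas as pd
import shap
import xgboost

# Synthetic data with two nearly identical (collinear) features A and B.
rng = np.random.default_rng(0)
n = 1000
A = rng.normal(size=n)
B = A + rng.normal(scale=0.01, size=n)   # B is almost a copy of A
C = rng.normal(size=n)
X = pd.DataFrame({"A": A, "B": B, "C": C})
y = 3 * A + 2 * C + rng.normal(scale=0.1, size=n)

model = xgboost.XGBRegressor(n_estimators=200, max_depth=3).fit(X, y)

# TreeExplainer computes SHAP values for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Mean absolute SHAP value per feature: a common global-importance summary.
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
# Typically either A or B carries nearly all of the shared credit here,
# because the trees rarely need to split on the duplicate feature.
```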

I can apply the shap library and interpret the chart, but what is the final output of it? What would management or users expect from it? I can't show this chart to a non-technical person. Is there any report that can be generated to draw a conclusion?

pra
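One common answer to the question above is to turn the SHAP output into artifacts a non-technical reader can consume, such as a plain table of global feature importances and a saved summary-plot image for a slide deck. This is a minimal sketch only: it assumes a fitted tree model `model` and a feature DataFrame `X` already exist (for example from the sketch above), and the output file names are arbitrary choices.

```python
import matplotlib
matplotlib.use("Agg")          # render plots without a display
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import shap

# Assumes `model` and `X` (a pandas DataFrame) are already defined.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# 1) A plain table of global feature importance, readable in a spreadsheet.
report = (
    pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns, name="mean_abs_shap")
    .sort_values(ascending=False)
    .to_frame()
)
report.to_csv("shap_feature_importance.csv")

# 2) The beeswarm/summary plot saved as an image for a report or deck.
shap.summary_plot(shap_values, X, show=False)
plt.tight_layout()
plt.savefig("shap_summary.png", dpi=150)
```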