
Interpretation with baseline Shapley value for feature groups on tree models

Tree models have made impressive progress in recent years, yet an important problem is to understand how these models make predictions, particularly in critical applications such as finance and medicine. Most previous works address this issue by measuring the importance of individual features. In this work, we consider the interpretation of feature groups, which is more effective at capturing the intrinsic structures and correlations among multiple features. We propose the Baseline Group Shapley value (BGShapvalue for short) to calculate the importance of a feature group for tree models. We further develop a polynomial-time algorithm, BGShapTree, to handle the exponentially many terms in the BGShapvalue. The basic idea is to decompose the BGShapvalue into leaves' weights and exploit the relationships between features and leaves. Based on this idea, we can greedily search for salient feature groups with large BGShapvalues. Extensive experiments have validated the effectiveness of our approach in comparison with state-of-the-art methods for the interpretation of tree models.
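The following is a minimal brute-force sketch of a baseline Shapley value for feature groups on a tree model, intended only to illustrate the definition the abstract refers to: features inside a coalition of groups keep their observed values, features outside it are reset to a baseline, and the group's contribution is the Shapley-weighted average of prediction differences. It is not the authors' BGShapTree algorithm, which avoids this exponential enumeration by decomposing the value into leaves' weights; names such as group_shapley, groups, and baseline are illustrative assumptions, not the paper's API.

from itertools import combinations
from math import factorial

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def group_shapley(model, x, baseline, groups, target):
    """Baseline Shapley value of feature group groups[target] for one sample x.

    groups is a list of index arrays partitioning the features; features
    outside a coalition are replaced by the corresponding baseline values.
    """
    n = len(groups)
    others = [g for g in range(n) if g != target]

    def value(coalition):
        # Features in the coalition's groups keep their observed values;
        # all remaining features are reset to the baseline.
        z = baseline.copy()
        for g in coalition:
            z[groups[g]] = x[groups[g]]
        return model.predict(z.reshape(1, -1))[0]

    phi = 0.0
    for k in range(len(others) + 1):
        for subset in combinations(others, k):
            # Standard Shapley weight for a coalition of size k out of n groups.
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            phi += weight * (value(subset + (target,)) - value(subset))
    return phi

# Toy usage: two feature groups on a small regression tree.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X[:, 0] + X[:, 1] + 0.5 * X[:, 2]
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
groups = [np.array([0, 1]), np.array([2, 3])]
print(group_shapley(tree, X[0], X.mean(axis=0), groups, target=0))

Because the loop enumerates every coalition of the remaining groups, this sketch scales exponentially in the number of groups; the polynomial-time route described in the abstract instead works with the tree's leaves.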

interpretability; Shapley value; random forests; decision tree

Fan XU, Zhi-Jian ZHOU, Jie NI, Wei GAO


National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China; School of Artificial Intelligence, Nanjing University, Nanjing 210023, China

2025

Frontiers of Computer Science

SCI
ISSN: 2095-2228
Year, Volume (Issue): 2025, 19(5)