Linear TreeShap

  • Medical School of UESTC
  • Telecom Paris
  • Shopify
  • University of Waikato

Research output: Chapter in book/report/anthology/collection › Conference contribution › Peer-reviewed

Abstract

Decision trees are well known for their interpretability. To improve accuracy, however, we need to grow deep trees or ensembles of trees, which are hard to interpret and so offset the original benefit. Shapley values have recently become a popular way to explain the predictions of tree-based machine learning models: they provide a linear weighting of the features, independent of the tree structure. Their rise in popularity is mainly due to TreeShap, which solves a generally exponential-complexity problem in polynomial time. Following extensive adoption in industry, even more efficient algorithms are required. This paper presents a more efficient and straightforward algorithm: Linear TreeShap. Like TreeShap, Linear TreeShap is exact and requires the same amount of memory.
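The exponential-complexity baseline that TreeShap and Linear TreeShap improve upon can be illustrated with a brute-force computation: for each feature, sum the weighted marginal contributions over all subsets of the remaining features, with missing features marginalized out by training-cover weights. The sketch below is illustrative only; the toy tree encoding and function names are assumptions, not the paper's algorithm or API.

```python
from itertools import combinations
from math import factorial

# Hypothetical toy tree encoding (not from the paper): an internal node is
# (feature_index, threshold, left_child, right_child, cover_left, cover_right),
# where covers count training samples reaching each child; a leaf is a float.
TREE = (0, 0.5,
        (1, 0.5, 1.0, 2.0, 50, 50),  # left subtree splits on feature 1
        3.0,                         # right child is a leaf
        60, 40)                      # 60 / 40 training samples go left / right

def expected_value(node, x, known):
    """Expected tree output for x, marginalizing unknown features by cover."""
    if not isinstance(node, tuple):
        return node  # leaf value
    feat, thr, left, right, cl, cr = node
    if feat in known:
        # Known feature: follow the branch x actually takes.
        return expected_value(left if x[feat] <= thr else right, x, known)
    # Unknown feature: average both branches, weighted by training cover.
    return (cl * expected_value(left, x, known)
            + cr * expected_value(right, x, known)) / (cl + cr)

def shapley_values(tree, x, n_features):
    """Exact Shapley values by enumerating all 2^(n-1) subsets per feature."""
    phis = []
    for i in range(n_features):
        others = [f for f in range(n_features) if f != i]
        phi = 0.0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                # Classic Shapley kernel weight |S|! (n - |S| - 1)! / n!
                w = (factorial(len(S)) * factorial(n_features - len(S) - 1)
                     / factorial(n_features))
                phi += w * (expected_value(tree, x, set(S) | {i})
                            - expected_value(tree, x, set(S)))
        phis.append(phi)
    return phis

x = [0.2, 0.8]
phi = shapley_values(TREE, x, 2)
# Efficiency property: the values sum to f(x) - E[f].
```

The subset enumeration costs O(2^n) model evaluations, which is exactly the blow-up that TreeShap avoids by exploiting the tree structure, and that Linear TreeShap reduces further.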

Original language: English
Title: Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Editors: S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, A. Oh
Publisher: Neural information processing systems foundation
ISBN (Electronic): 9781713871088
Publication status: Published - 1 Jan 2022
Event: 36th Conference on Neural Information Processing Systems, NeurIPS 2022 - New Orleans, United States
Duration: 28 Nov 2022 - 9 Dec 2022

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 35
ISSN (Print): 1049-5258

Conference

Conference: 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 28/11/22 - 9/12/22

