Interventional SHAP Values and Interaction Values for Piecewise Linear Regression Trees
Authors: Artjom Zern, Klaus Broelemann, Gjergji Kasneci
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we conduct a runtime evaluation of the presented algorithms using various datasets and instances of (piecewise linear) gradient boosting trees. |
| Researcher Affiliation | Collaboration | SCHUFA Holding AG, Germany; University of Tuebingen, Germany. artjom.zern@schufa.de, klaus.broelemann@schufa.de, gjergji.kasneci@uni-tuebingen.de |
| Pseudocode | Yes | Algorithm 1: Interventional SHAP |
| Open Source Code | Yes | The source code for the presented algorithms is available at https://github.com/schufa-innovationlab/pltreeshap |
| Open Datasets | Yes | For this, we use the numerical regression datasets provided by Grinsztajn, Oyallon, and Varoquaux (2022, Section A.1.2). These datasets are tabular data, which were preprocessed for neural networks and are publicly available on OpenML. |
| Dataset Splits | No | The paper mentions '75% of the data for the training and as background dataset' and 'remaining 25% of the data' for SHAP computation, indicating a train/test split, but it does not specify a separate validation split (a minimal loading-and-split sketch follows this table). |
| Hardware Specification | Yes | All experiments were run on a dedicated workstation with the following parameters: Intel Core i7-8700K CPU @ 3.7 GHz, 64 GB main memory. |
| Software Dependencies | No | The paper mentions using 'LightGBM' but does not provide specific version numbers for it or any other software dependencies. |
| Experiment Setup | Yes | For each evaluated dataset, we select the parameters of the model by a grid search with n_estimators ∈ {20, 50, 100, 250, 500, 1000} and max_depth ∈ {2, 4, 6, 8, 10} (a grid-search sketch follows this table). |
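
The Open Datasets and Dataset Splits rows describe OpenML regression data and a 75%/25% split between training/background data and the instances used for SHAP computation. The snippet below is a minimal sketch of one way to reproduce such a setup with scikit-learn; the dataset name, the random seed, and the numeric-dtype handling are assumptions for illustration, not details taken from the paper.

```python
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split

# Placeholder name: substitute any numerical regression dataset from the
# Grinsztajn, Oyallon, and Varoquaux (2022) benchmark suite on OpenML.
DATASET_NAME = "house_sales"

bunch = fetch_openml(name=DATASET_NAME, as_frame=True)
X = bunch.data.to_numpy(dtype=float)
y = bunch.target.to_numpy(dtype=float)

# 75% for training (also used as the background dataset), 25% held out
# for the SHAP computation; the random seed is an assumption.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
```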
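
The Experiment Setup row reports a grid search over n_estimators and max_depth for (piecewise linear) gradient boosting trees trained with LightGBM. The sketch below shows one plausible way to run that search, reusing X_train and y_train from the previous sketch; the parameter grid comes from the paper, while the use of LGBMRegressor(linear_tree=True), 5-fold cross-validation, and MSE scoring are assumptions.

```python
from lightgbm import LGBMRegressor
from sklearn.model_selection import GridSearchCV

# Grid from the paper; everything else in this block is an assumption.
param_grid = {
    "n_estimators": [20, 50, 100, 250, 500, 1000],
    "max_depth": [2, 4, 6, 8, 10],
}

# linear_tree=True makes LightGBM fit a linear model in each leaf,
# i.e. piecewise linear regression trees.
search = GridSearchCV(
    LGBMRegressor(linear_tree=True),
    param_grid,
    cv=5,
    scoring="neg_mean_squared_error",
)
search.fit(X_train, y_train)
model = search.best_estimator_
```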
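
The Pseudocode row refers to the paper's Algorithm 1 (Interventional SHAP). The function below is not that algorithm; it is a brute-force sketch of the interventional SHAP definition that tree-based algorithms like Algorithm 1 evaluate efficiently, and it is only practical for low-dimensional toy inputs. All names are illustrative.

```python
import itertools
import math

import numpy as np


def interventional_shap(predict, x, background):
    """Brute-force interventional SHAP values for a single instance.

    predict    : callable mapping an (n, d) array to n model outputs
    x          : shape-(d,) instance to explain
    background : shape-(m, d) background dataset used for the interventions

    Runtime is exponential in d; this only illustrates the definition.
    """
    d = x.shape[0]

    def coalition_value(subset):
        # v(S) = E_z[ f(x_S, z_{N \ S}) ]: features in S are taken from x,
        # the remaining features are drawn from the background samples.
        data = np.array(background, dtype=float, copy=True)
        idx = list(subset)
        if idx:
            data[:, idx] = x[idx]
        return float(np.mean(predict(data)))

    phi = np.zeros(d)
    for i in range(d):
        rest = [j for j in range(d) if j != i]
        for size in range(d):
            # Shapley weight |S|! (d - |S| - 1)! / d! for coalitions of this size.
            weight = (
                math.factorial(size) * math.factorial(d - size - 1) / math.factorial(d)
            )
            for subset in itertools.combinations(rest, size):
                phi[i] += weight * (
                    coalition_value(subset + (i,)) - coalition_value(subset)
                )
    return phi
```

For example, `interventional_shap(model.predict, X_test[0], X_train[:100])` can serve as a small-scale reference check: by local accuracy, the returned values should sum to the model's prediction for the instance minus the mean prediction over the background samples.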