A Gradient-Based Split Criterion for Highly Accurate and Transparent Model Trees
Authors: Klaus Broelemann, Gjergji Kasneci
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we will evaluate the effectiveness of this novel method on multiple datasets. The experiments cover both classification and regression. |
| Researcher Affiliation | Industry | Klaus Broelemann and Gjergji Kasneci SCHUFA Holding AG, Wiesbaden, Germany |
| Pseudocode | Yes | Algorithm 1 Training Model Trees... Algorithm 2 Gradient-Based Split Finding |
| Open Source Code | No | The paper does not provide an explicit statement about, or a link to, open-source code for the described methodology. |
| Open Datasets | Yes | All datasets come from public sources [Yeh and Lien, 2009; Zięba et al., 2016] and cover both classification and regression tasks. The number of samples and attributes of these datasets are displayed in Tab. 3. |
| Dataset Splits | Yes | For our experiments, we performed 4-fold cross validation and averaged the 4 performance measurements. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for the experiments. |
| Software Dependencies | No | The paper does not specify version numbers for any software dependencies. |
| Experiment Setup | Yes | With transparency and shallow trees in mind, we restrict all model trees to a fixed depth of 1, 2, or 3. ... For our experiments, we performed 4-fold cross validation and averaged the 4 performance measurements. (A sketch of this protocol follows the table.) |
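The experiment-setup row above describes the evaluation protocol only at a high level. Below is a minimal sketch of that protocol, assuming a scikit-learn environment. Because the paper's gradient-based model trees are not released as open source, a standard `DecisionTreeClassifier` stands in as a placeholder estimator, and `load_breast_cancer` is a stand-in public dataset rather than one of the paper's benchmarks.

```python
# Minimal sketch of the quoted protocol: shallow trees restricted to depth
# 1, 2, or 3, scored by 4-fold cross validation with the 4 scores averaged.
# DecisionTreeClassifier and load_breast_cancer are placeholders; the paper's
# gradient-based model trees and benchmark datasets are not available here.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # stand-in public binary classification dataset

for depth in (1, 2, 3):  # fixed shallow depths, as in the paper's setup
    model = DecisionTreeClassifier(max_depth=depth, random_state=0)
    scores = cross_val_score(model, X, y, cv=4, scoring="roc_auc")  # 4-fold CV
    print(f"depth={depth}: mean AUC over 4 folds = {scores.mean():.3f}")
```

The same loop covers the regression side of the experiments by swapping in a regressor and a regression metric such as RMSE.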