GradTree: Learning Axis-Aligned Decision Trees with Gradient Descent

Authors: Sascha Marton, Stefan Lüdtke, Christian Bartelt, Heiner Stuckenschmidt

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our approach outperforms existing methods on binary classification benchmarks and achieves competitive results for multi-class tasks. The implementation is available under: https://github.com/s-marton/GradTree. We empirically evaluate GradTree on a large number of real-world datasets (Section 4).
Researcher Affiliation | Academia | University of Mannheim, Germany; University of Rostock, Germany. {sascha.marton, christian.bartelt, heiner.stuckenschmidt}@uni-mannheim.de, stefan.luedtke@uni-rostock.de
Pseudocode | Yes | Algorithm 1: Tree Pass Function (a sketch of a differentiable tree pass appears below the table).
Open Source Code | Yes | The implementation is available under: https://github.com/s-marton/GradTree
Open Datasets | Yes | The experiments were conducted on several benchmark datasets, mainly from the UCI repository (Dua and Graff 2017).
Dataset Splits | Yes | We used an 80%/20% train-test split for all datasets. For GradTree and DNDT, we used 20% of the training data as validation data for early stopping. (See the split sketch below the table.)
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments.
Software Dependencies | No | The paper mentions the 'sklearn (Pedregosa et al. 2011) implementation' but does not provide specific version numbers for any software dependencies.
Experiment Setup | No | The paper states that 'The complete list of relevant hyperparameters for each approach along with additional details on the selection are in the supplementary material' and mentions general techniques such as the Adam optimizer and early stopping, but it does not give concrete hyperparameter values or system-level training settings in the main text. (See the training sketch below the table.)
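
Three short Python sketches follow, keyed to the rows above. First, the Pseudocode row: Algorithm 1 in the paper is a tree pass function over a dense representation of a complete, axis-aligned tree. The sketch below is a hypothetical reconstruction, not the paper's algorithm: it uses soft sigmoid splits and integer feature indices for readability, whereas GradTree uses straight-through estimators to keep splits hard and axis-aligned during training. All names (tree_pass, feature_idx, and so on) are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tree_pass(x, feature_idx, thresholds, leaf_values, depth):
    """Forward pass through a complete binary tree of the given depth.

    x            : (n_features,) input sample
    feature_idx  : (2**depth - 1,) feature used at each internal node
    thresholds   : (2**depth - 1,) split threshold at each internal node
    leaf_values  : (2**depth,) value stored at each leaf
    Each leaf's weight is the product of soft split decisions along its
    root-to-leaf path; the prediction is the weighted sum over leaves.
    """
    n_leaves = 2 ** depth
    prediction = 0.0
    for leaf in range(n_leaves):
        path_prob, node = 1.0, 0  # start at the root (node 0)
        for level in range(depth):
            # Bit `level` of the leaf index encodes left (0) / right (1).
            go_right = (leaf >> (depth - 1 - level)) & 1
            p_right = sigmoid(x[feature_idx[node]] - thresholds[node])
            path_prob *= p_right if go_right else (1.0 - p_right)
            node = 2 * node + 1 + go_right  # child in level-order layout
        prediction += path_prob * leaf_values[leaf]
    return prediction

# Tiny usage example: depth-2 tree on a 3-feature input.
x = np.array([0.2, 1.5, -0.7])
print(tree_pass(x, feature_idx=[1, 0, 2],
                thresholds=np.array([1.0, 0.0, -1.0]),
                leaf_values=np.array([0.0, 1.0, 0.0, 1.0]),
                depth=2))
```

Replacing the sigmoid with a hard step function makes path_prob select exactly one leaf, which is how a standard axis-aligned decision tree is recovered at inference time.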
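Second, the Dataset Splits row. A minimal sketch of the reported protocol, assuming scikit-learn's train_test_split (the paper does not say which tool performed the split, and the random seed here is an arbitrary placeholder):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in for one of the UCI benchmark datasets.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# 80%/20% train-test split, as reported for all datasets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, random_state=0)

# For GradTree and DNDT: 20% of the training data held out as
# validation data for early stopping.
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.20, random_state=0)
```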
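Third, the Experiment Setup row. The main text names the Adam optimizer and early stopping without concrete values, so the sketch below only illustrates that general pattern in Keras; the model, learning rate, epoch budget, and patience are all placeholders rather than the paper's settings, and it reuses the splits from the previous sketch.

```python
import tensorflow as tf

# Placeholder model: the actual GradTree model is a dense tree
# parameterisation (see the tree-pass sketch above), not an MLP.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Adam optimizer, as mentioned in the paper; the learning rate is an
# assumption, not a value reported in the main text.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy")

# Early stopping on the validation split described above; the patience
# value is an assumption.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=25, restore_best_weights=True)

model.fit(X_train, y_train,
          validation_data=(X_val, y_val),
          epochs=1000, callbacks=[early_stop], verbose=0)
```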