Wasserstein Gradient Boosting: A Framework for Distribution-Valued Supervised Learning

Authors: Takuo Matsubara

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We empirically demonstrate the performance of the WGBoost algorithm through three experiments using real-world tabular data."
Researcher Affiliation | Academia | Takuo Matsubara, The University of Edinburgh, Edinburgh EH9 3JZ, takuo.matsubara@ed.ac.uk
Pseudocode | Yes | Algorithm 1: Wasserstein Gradient Boosting; Algorithm 2: Wasserstein-Boosted Evidential Learning (a hedged sketch of a simplified training loop appears after the table)
Open Source Code | Yes | "The source code is available at https://github.com/takuomatsubara/WGBoost."
Open Datasets | Yes | "The benchmark protocol uses real-world tabular datasets from the UCI machine learning repository [54]."
Dataset Splits | Yes | "where we held out 20% of the training set as a validation set and chose the number 1 ≤ M ≤ 4000 achieving the least validation error" (see the selection sketch after the table)
Hardware Specification | Yes | "All the experiments were performed with x86-64 CPUs, where some of them were parallelised over up to 10 CPUs and the rest used 1 CPU."
Software Dependencies | No | The paper mentions software such as XGBoost and LightGBM and discusses various algorithms, but it does not specify version numbers for any of the dependencies required to reproduce the experiments.
Experiment Setup | Yes | "Common Hyperparameters: Throughout, we set the number of output particles N to 10 and set each weak learner f to the decision tree regressor [50] with maximum depth 1 for Section 4.1 and 3 for the rest. We set the learning rate ν to 0.1 for regression and 0.4 for classification, unless otherwise stated."
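
To make the pseudocode row concrete, here is a minimal sketch of a WGBoost-style training loop that uses the common hyperparameters quoted above (N = 10 output particles, depth-3 decision tree regressors, learning rate ν = 0.1). This is not the paper's Algorithm 1: the per-particle direction below is the plain score of the target distribution, whereas the paper fits weak learners to a proper Wasserstein-gradient estimate. The `score` callable, the one-dimensional particles, and the function names `wgboost_fit` / `wgboost_predict` are illustrative assumptions, not the repository's API.

```python
# Hedged sketch of a WGBoost-style loop (simplified; not the paper's exact
# Algorithm 1). Assumes each training response y_i defines a target
# distribution p(z | y_i) given through its score function
# score(y, z) = d/dz log p(z | y), with one-dimensional particles z.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def wgboost_fit(X, y, score, n_particles=10, n_rounds=200, lr=0.1, max_depth=3):
    n = X.shape[0]
    z0 = np.linspace(-2.0, 2.0, n_particles)   # fixed initial particle set
    Z = np.tile(z0, (n, 1))                    # (n, N) current output particles
    trees = []                                 # one list of N trees per round
    for _ in range(n_rounds):
        round_trees = []
        for j in range(n_particles):
            # Update direction for particle j at every training input: here
            # the target score; the paper instead uses a Wasserstein-gradient
            # estimate that also couples the particles.
            g = np.array([score(y[i], Z[i, j]) for i in range(n)])
            tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, g)
            Z[:, j] += lr * tree.predict(X)    # boosting update
            round_trees.append(tree)
        trees.append(round_trees)
    return z0, trees

def wgboost_predict(z0, trees, X_new, lr=0.1):
    # Replay the fitted weak learners from the same fixed initialisation.
    Z = np.tile(z0, (X_new.shape[0], 1))
    for round_trees in trees:
        for j, tree in enumerate(round_trees):
            Z[:, j] += lr * tree.predict(X_new)
    return Z   # N particles per input approximating the output distribution
```

For instance, under an assumed Gaussian likelihood target with fixed scale sigma, `score(y, z) = (y - z) / sigma**2`, and the loop reduces to a particle-valued regression model.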
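
The dataset-splits row reads as a standard early-stopping protocol. The sketch below, reusing the loop above, holds out 20% of the training data and keeps the round count 1 ≤ M ≤ 4000 with the smallest validation error. Here `val_error` is a hypothetical placeholder for the experiment's metric (for example, a negative log-likelihood computed from the held-out particles), and `select_num_rounds` is an illustrative name.

```python
# Sketch of the validation protocol quoted above: hold out 20% of the
# training set and select the number of boosting rounds M (1 <= M <= 4000)
# minimising the validation error. `score` is as in the previous sketch;
# `val_error(Z_val, y_val)` is a hypothetical stand-in for the paper's metric.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

def select_num_rounds(X, y, score, val_error, max_rounds=4000,
                      n_particles=10, lr=0.1, max_depth=3):
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2,
                                                random_state=0)
    z0 = np.linspace(-2.0, 2.0, n_particles)
    Z_tr = np.tile(z0, (X_tr.shape[0], 1))
    Z_val = np.tile(z0, (X_val.shape[0], 1))
    best_M, best_err = 0, float("inf")
    for M in range(1, max_rounds + 1):
        for j in range(n_particles):
            g = np.array([score(y_tr[i], Z_tr[i, j])
                          for i in range(X_tr.shape[0])])
            tree = DecisionTreeRegressor(max_depth=max_depth).fit(X_tr, g)
            Z_tr[:, j] += lr * tree.predict(X_tr)
            Z_val[:, j] += lr * tree.predict(X_val)  # track held-out particles
        err = val_error(Z_val, y_val)                # error after M rounds
        if err < best_err:
            best_M, best_err = M, err
    return best_M
```

Fitting once up to the maximum round count and recording the error after every round, as above, avoids refitting the model from scratch for each candidate M.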