Multi-group Learning for Hierarchical Groups

Authors: Samuel Deng, Daniel Hsu

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We then conduct an empirical evaluation of our algorithm and find that it achieves attractive generalization properties on real datasets with hierarchical group structure."
Researcher Affiliation | Academia | "Department of Computer Science, Columbia University. Correspondence to: Samuel Deng <samdeng@cs.columbia.edu>, Daniel Hsu <djhsu@cs.columbia.edu>."
Pseudocode | Yes | "Algorithm 1 (MGL-Tree)" (a hedged sketch of one possible instantiation appears below)
Open Source Code | No | The paper refers to a third-party "open-source xgboost implementation" but does not state that the code for its own methodology is open source, nor does it provide a link to it.
Open Datasets | Yes | "We conduct our experiments on twelve U.S. Census datasets from the Folktables package of Ding et al. (2021)." (A loading sketch appears below.)
Dataset Splits | No | The paper mentions using "a held-out test set of 20% of the data" but does not specify the full train/validation/test split or cross-validation details needed for reproduction. (A split sketch appears below.)
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory, or cloud instance types) used for running the experiments.
Software Dependencies | No | The paper mentions using the scikit-learn and xgboost implementations but does not specify their version numbers.
Experiment Setup | Yes | Model hyperparameters: Logistic Regression (loss = log loss, dual = False, solver = lbfgs); Decision Tree (criterion = log loss, max depth in {2, 4, 8}); Random Forest (criterion = log loss); XGBoost (objective = binary:logistic). (An instantiation sketch appears below.)
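
The pseudocode entry only names Algorithm 1 (MGL-Tree), so the following is a minimal sketch of one plausible top-down instantiation, not the authors' pseudocode: each node of the group hierarchy keeps the better of its own empirical-risk-minimizing fit and the predictor inherited from its parent, compared on the node's own group data; at test time a point would be routed to the predictor of the most specific group containing it. The GroupNode type, the log-loss comparison, and the traversal order are all assumptions made for illustration.

    import numpy as np
    from dataclasses import dataclass, field
    from sklearn.base import clone
    from sklearn.metrics import log_loss

    @dataclass
    class GroupNode:
        # Hypothetical node type: `mask` selects this group's rows of (X, y).
        name: str
        mask: np.ndarray
        children: list = field(default_factory=list)
        predictor: object = None

    def fit_mgl_tree(root, X, y, base_model):
        # Sketch assumes binary labels {0, 1} with both classes present in
        # every group of the hierarchy.
        root.predictor = clone(base_model).fit(X[root.mask], y[root.mask])
        stack = [root]
        while stack:
            node = stack.pop()
            for child in node.children:
                Xc, yc = X[child.mask], y[child.mask]
                own = clone(base_model).fit(Xc, yc)
                # Keep the child's own fit only if it beats the inherited
                # predictor on the child's own group data.
                own_risk = log_loss(yc, own.predict_proba(Xc))
                inherited_risk = log_loss(yc, node.predictor.predict_proba(Xc))
                child.predictor = own if own_risk <= inherited_risk else node.predictor
                stack.append(child)
        return root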
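For the Folktables datasets, the package's standard access path is its ACSDataSource interface; the specific task (ACSIncome), survey year, horizon, and state below are illustrative assumptions, since the paper's twelve task/configuration choices are not in the extracted evidence.

    from folktables import ACSDataSource, ACSIncome

    # Download one year of ACS person-level data for one state (assumed values).
    data_source = ACSDataSource(survey_year="2018", horizon="1-Year", survey="person")
    acs_data = data_source.get_data(states=["CA"], download=True)
    features, labels, group = ACSIncome.df_to_numpy(acs_data)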
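Continuing from the loading sketch, the reported "held-out test set of 20% of the data" maps onto a standard scikit-learn split; the random seed (and any further validation split) is unspecified in the paper, so the value below is an arbitrary assumption a reproduction would need to pin down.

    from sklearn.model_selection import train_test_split

    # 80/20 train/test split; random_state is an assumed, not reported, value.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0
    )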
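The reported hyperparameters translate directly into scikit-learn and xgboost constructor arguments; everything not listed falls back to library defaults, which is exactly where the missing version numbers bite (for example, criterion="log_loss" only exists in scikit-learn >= 1.1). A sketch under those assumptions:

    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier
    from xgboost import XGBClassifier

    # Only the paper-reported settings are passed explicitly; all other
    # hyperparameters are library defaults. Log loss is intrinsic to
    # LogisticRegression, so no separate loss argument is needed.
    models = {
        "logistic_regression": LogisticRegression(dual=False, solver="lbfgs"),
        **{
            f"decision_tree_depth_{d}": DecisionTreeClassifier(criterion="log_loss", max_depth=d)
            for d in (2, 4, 8)
        },
        "random_forest": RandomForestClassifier(criterion="log_loss"),
        "xgboost": XGBClassifier(objective="binary:logistic"),
    }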