Efficient Algorithms for Empirical Group Distributionally Robust Optimization and Beyond

Authors: Dingzhi Yu, Yunuo Cai, Wei Jiang, Lijun Zhang

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we conduct numerical experiments on empirical GDRO and empirical MERO to evaluate the performance of our algorithms."
Researcher Affiliation | Academia | "1 National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China; 2 School of Data Science, Fudan University, Shanghai, China; 3 Pazhou Laboratory (Huangpu), Guangzhou, China."
Pseudocode | Yes | "Algorithm 1 Variance-Reduced Stochastic Mirror Prox Algorithm for Empirical GDRO (ALEG)" ... "Algorithm 2 Two-Stage Algorithm for Empirical MERO (ALEM)" (an illustrative sketch follows the table)
Open Source Code | No | The paper does not provide any statements about releasing open-source code for the described methodology or a link to a code repository.
Open Datasets | Yes | "For the real-world dataset, we use CIFAR-100 (Krizhevsky et al., 2009)" (a loading snippet follows the table)
Dataset Splits | No | The paper mentions "500 training images and 100 testing images for each class" for CIFAR-100 but does not specify a separate validation split or dataset.
Hardware Specification | No | The paper does not specify any hardware details such as GPU/CPU models, memory, or cloud computing resources used for running the experiments.
Software Dependencies | No | The paper does not provide specific software dependencies or their version numbers (e.g., Python, PyTorch, TensorFlow versions).
Experiment Setup | Yes | "Under conditions in Theorem 4.4, by setting K = Θ(√n̄), the computation complexity for Algorithm 1 to reach ε-accuracy of (3) is O(m√(n̄ ln m)/ε)." (the underlying problem and notation are recapped below)
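
For context on the Experiment Setup row, the block below recaps, in LaTeX, the empirical GDRO saddle-point problem that the ε-accuracy and complexity statements refer to. The notation (m groups, n_i samples in group i, average group size n̄) follows standard GDRO formulations and is an assumption about the paper's problem (3), not a verbatim reproduction.

```latex
% Assumed form of the empirical GDRO problem (the paper's problem (3)):
% m groups; group i holds n_i samples z_{i,1},...,z_{i,n_i};
% \bar{n} = \frac{1}{m}\sum_{i=1}^{m} n_i is the average group size.
\[
  \min_{\mathbf{w}\in\mathcal{W}} \;\max_{\mathbf{q}\in\Delta_m}\;
  \sum_{i=1}^{m} q_i\, R_i(\mathbf{w}),
  \qquad
  R_i(\mathbf{w}) \;=\; \frac{1}{n_i}\sum_{j=1}^{n_i} \ell\bigl(\mathbf{w}; z_{i,j}\bigr).
\]
% An \varepsilon-accurate solution bounds the primal-dual gap of this
% saddle-point problem by \varepsilon; the quoted corollary gives the total
% gradient-computation cost Algorithm 1 needs to reach that accuracy.
```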
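
The Pseudocode row names ALEG, a variance-reduced stochastic mirror prox algorithm for empirical GDRO, but the paper releases no code. The sketch below is therefore not ALEG itself: it is a minimal stochastic descent-ascent loop on a synthetic empirical GDRO instance, with an SVRG-style control variate on the w-gradient to illustrate the role of the snapshot period K. All data, losses, step sizes, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic empirical GDRO instance: m groups, each with its own linear
# regression data set (sizes, noise levels, and losses are illustrative).
m, d, n_i = 5, 20, 200
X = [rng.normal(size=(n_i, d)) for _ in range(m)]
w_star = rng.normal(size=d)
y = [Xi @ w_star + 0.5 * (i + 1) * rng.normal(size=n_i) for i, Xi in enumerate(X)]

def group_grad(w, i, idx=None):
    """(Mini-batch) gradient of the group-i empirical risk R_i; full batch if idx is None."""
    Xi, yi = (X[i], y[i]) if idx is None else (X[i][idx], y[i][idx])
    return Xi.T @ (Xi @ w - yi) / len(yi)

def group_risk(w, i, idx):
    """Mini-batch estimate of R_i(w) (mean squared error)."""
    return 0.5 * np.mean((X[i][idx] @ w - y[i][idx]) ** 2)

# Stochastic descent-ascent on  min_w max_{q in simplex} sum_i q_i R_i(w):
# Euclidean step on w, entropic (exponentiated-gradient) step on q.
w = np.zeros(d)
q = np.full(m, 1.0 / m)          # uniform group weights on the simplex
eta_w, eta_q = 0.01, 0.1
T, K = 600, 50                   # K = snapshot (full-gradient) period

for t in range(T):
    if t % K == 0:               # refresh the variance-reduction anchor
        w_snap = w.copy()
        full_grads = [group_grad(w_snap, i) for i in range(m)]

    idx = rng.integers(0, n_i, size=8)       # shared mini-batch indices
    # SVRG-style estimate: grad(w, batch) - grad(w_snap, batch) + full grad(w_snap)
    g = [group_grad(w, i, idx) - group_grad(w_snap, i, idx) + full_grads[i]
         for i in range(m)]
    risks = np.array([group_risk(w, i, idx) for i in range(m)])

    w = w - eta_w * sum(q[i] * g[i] for i in range(m))   # descent step on w
    q = q * np.exp(eta_q * risks)                        # ascent step on q
    q /= q.sum()                                         # stay on the simplex

print("final group weights q:", np.round(q, 3))
```

ALEG proper uses mirror prox (extra-gradient) updates rather than the single descent-ascent step shown here, and ALEM is a separate two-stage procedure for empirical MERO; neither refinement is reproduced in this sketch.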
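
Finally, the Open Datasets and Dataset Splits rows refer to CIFAR-100's standard split of 500 training and 100 test images per class. The paper does not name its software stack (see the Software Dependencies row), so the snippet below is only one convenient way to fetch the dataset and confirm those counts; torchvision is an assumed dependency, not one stated by the paper.

```python
# Assumed tooling: torchvision is used here only to download CIFAR-100 and to
# confirm the standard 500 train / 100 test images per class quoted above.
from collections import Counter

from torchvision.datasets import CIFAR100

train_set = CIFAR100(root="./data", train=True, download=True)
test_set = CIFAR100(root="./data", train=False, download=True)

train_counts = Counter(train_set.targets)   # class index -> number of images
test_counts = Counter(test_set.targets)

print(len(train_set), len(test_set))                          # 50000 10000
print(set(train_counts.values()), set(test_counts.values()))  # {500} {100}
```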