Group-wise oracle-efficient algorithms for online multi-group learning

Authors: Samuel Deng, Jingwen Liu, Daniel J. Hsu

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This is a theory paper; the main contributions are novel algorithmic design principles for a learning-theoretic model.
Researcher Affiliation | Academia | Samuel Deng (Department of Computer Science, Columbia University, samdeng@cs.columbia.edu); Daniel Hsu (Department of Computer Science, Columbia University, djhsu@cs.columbia.edu); Jingwen Liu (Department of Computer Science, Columbia University, jingwenliu@cs.columbia.edu)
Pseudocode | Yes | Algorithm 1: Algorithm for Group-wise Oracle Efficiency (for smoothed online learning)
Open Source Code | No | The paper does not include any explicit statement about releasing its own source code, nor does it provide a link to a code repository. As stated in the NeurIPS checklist, this is a theory paper.
Open Datasets | No | The paper mentions 'n i.i.d. training samples' in a theoretical context, but does not provide any concrete access information, citations, or links to a publicly available dataset. As stated in the NeurIPS checklist, this is a theory paper and does not involve empirical studies with specific datasets.
Dataset Splits | No | The paper is theoretical and does not describe any experimental procedures or dataset splits. No specific details about validation splits are provided.
Hardware Specification | No | The paper is theoretical and does not provide any specific hardware details used for running experiments.
Software Dependencies | No | The paper is theoretical and does not list specific software dependencies with version numbers required for replicating experiments.
Experiment Setup | No | The paper is theoretical and does not provide concrete hyperparameter values, training configurations, or system-level settings for an experimental setup.