Universal Rates of Empirical Risk Minimization

Authors: Steve Hanneke, Mingyue Xu

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our main result is a fundamental tetrachotomy: there are only four possible universal learning rates by ERM, namely, the learning curves of any concept class learnable by ERM decay at either e^{-n}, 1/n, log(n)/n, or arbitrarily slow rates (see the formal sketch after this table). Moreover, we provide a complete characterization of which concept classes fall into each of these categories, via new complexity structures. We also develop new combinatorial dimensions which supply sharp asymptotically valid constant factors for these rates, whenever possible.
Researcher Affiliation | Academia | Steve Hanneke, Department of Computer Science, Purdue University (steve.hanneke@gmail.com); Mingyue Xu, Department of Computer Science, Purdue University (xu1864@purdue.edu)
Pseudocode | No | The paper does not contain any sections or figures explicitly labeled "Pseudocode" or "Algorithm", nor does it present structured algorithm blocks.
Open Source Code | No | The paper is theoretical and does not mention releasing source code for its methodology or provide a link to a code repository.
Open Datasets | No | The paper is theoretical and does not involve experimental evaluation on datasets, so it does not discuss public dataset availability for training.
Dataset Splits | No | The paper is theoretical and does not involve experimental evaluation or dataset splitting, so it does not provide training/validation/test splits.
Hardware Specification | No | The paper is theoretical and does not report an experimental setup or the hardware used for computation.
Software Dependencies | No | The paper is theoretical and does not report software dependencies with specific version numbers.
Experiment Setup | No | The paper is theoretical and does not include experimental details such as hyperparameters or system-level training settings.
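
For readers unfamiliar with the universal-rates framework, here is a rough formal sketch of the tetrachotomy quoted in the Research Type row. The notation is assumed for illustration, not copied from the paper: \hat{h}_n denotes the ERM output after n i.i.d. examples and er(\hat{h}_n) its error under the data distribution. The four possible universal learning curves are

    E[er(\hat{h}_n)] = e^{-\Theta(n)}, \quad \Theta(1/n), \quad \Theta(\log(n)/n), \quad \text{or slower than any vanishing rate,}

where "universal" means the stated decay holds for every fixed (realizable) distribution, with constants that may depend on the distribution.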
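
To make "learning by ERM" concrete, the following is a minimal, self-contained Python sketch of an empirical risk minimizer over a finite concept class. Everything here (the names, the threshold class, the toy sample) is an illustrative assumption; the paper is purely theoretical and provides no code.

    # Minimal sketch of empirical risk minimization (ERM) over a finite
    # concept class. Illustrative only; not the paper's notation or code.

    from typing import Callable, Sequence, Tuple

    Hypothesis = Callable[[float], int]

    def erm(
        concept_class: Sequence[Hypothesis],
        sample: Sequence[Tuple[float, int]],
    ) -> Hypothesis:
        """Return a hypothesis minimizing empirical 0-1 risk on the sample."""
        def empirical_risk(h: Hypothesis) -> float:
            return sum(h(x) != y for x, y in sample) / len(sample)
        return min(concept_class, key=empirical_risk)

    # Example: threshold classifiers on [0, 1], a classic concept class.
    thresholds = [lambda x, t=t / 10: int(x >= t) for t in range(11)]
    data = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
    h_hat = erm(thresholds, data)
    print([h_hat(x) for x, _ in data])  # predictions of the ERM hypothesis

On this toy sample, any threshold strictly between 0.4 and 0.6 attains zero empirical risk, and min returns the first such minimizer. The paper's results concern how fast the error of such a minimizer can decay as the sample size n grows, uniformly in the sense of universal rates.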