Efficiently Learning Adversarially Robust Halfspaces with Noise

Authors: Omar Montasser, Surbhi Goel, Ilias Diakonikolas, Nathan Srebro

ICML 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | The paper consists entirely of theoretical analysis: definitions, lemmas, theorems, and algorithmic descriptions, with no experimental section, dataset evaluations, or performance tables. For example, Section 3 ("The Realizable Setting") and Section 4 ("Random Classification Noise") are purely mathematical analyses. No empirical results are presented.
Researcher Affiliation | Academia | All listed affiliations are academic: Toyota Technological Institute at Chicago, University of Texas at Austin, and University of Wisconsin-Madison.
Pseudocode | No | The paper describes algorithmic procedures (e.g., in Lemmas 3.7 and 4.3) in prose and mathematical equations but includes no structured pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper contains no statement about releasing source code and provides no link to a code repository for the described methodology.
Open Datasets | No | This paper is theoretical and does not describe or run experiments on any specific dataset; thus, there is no mention of a publicly available dataset for training.
Dataset Splits | No | This paper is theoretical and does not involve empirical experiments with datasets; therefore, no dataset split information for validation is provided.
Hardware Specification | No | This paper is theoretical and does not involve running computational experiments; therefore, no hardware specifications are mentioned.
Software Dependencies | No | This paper is theoretical and does not discuss implementation details; accordingly, no software dependencies or version numbers needed for reproducibility are specified.
Experiment Setup | No | This paper is theoretical and does not describe any empirical experiments; therefore, no experimental setup information (such as hyperparameters or system-level training settings) is provided.