Provable Robustness of Adversarial Training for Learning Halfspaces with Noise
Authors: Difan Zou, Spencer Frei, Quanquan Gu
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We analyze the properties of adversarial training for learning adversarially robust halfspaces in the presence of agnostic label noise. ... To the best of our knowledge, this is the first work to show that adversarial training provably yields robust classifiers in the presence of noise. ... In this work, we show that adversarial training provably leads to halfspaces that are approximate minimizers for the population-level robust classification error. ... Algorithm 1 Adversarial Training ... Theorem 2.8. ... Theorem 3.2. |
| Researcher Affiliation | Academia | 1Department of Computer Science, UCLA 2Department of Statistics, UCLA. Correspondence to: Quanquan Gu <qgu@cs.ucla.edu>. |
| Pseudocode | Yes | Algorithm 1 Adversarial Training ... Algorithm 2 Projected Stochastic Adversarial Training (PSAT(p, r)) |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | The paper is theoretical, analyzing algorithms with respect to theoretical data distributions (e.g., 'distribution D', log-concave isotropic distributions). It does not use, or provide access information for, any real-world or benchmark dataset. |
| Dataset Splits | No | The paper is theoretical and conducts no experiments, so it provides no training/validation/test dataset splits. |
| Hardware Specification | No | The paper is theoretical and reports no empirical experiments, so it mentions no hardware used to run experiments. |
| Software Dependencies | No | The paper is theoretical and reports no empirical experiments, so it lists no software dependencies with version numbers needed for replication. |
| Experiment Setup | No | The paper is theoretical and conducts no empirical experiments, so it provides no details on experimental setup, hyperparameters, or training configurations. |