Robustness Guarantees for Adversarially Trained Neural Networks
Authors: Poorya Mianjy, Raman Arora
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide empirical evidence to support our theoretical results. |
| Researcher Affiliation | Academia | Poorya Mianjy Johns Hopkins University Raman Arora Johns Hopkins University arora@cs.jhu.edu |
| Pseudocode | Yes | Algorithm 1 Atk PGD Attack |
| Open Source Code | No | The paper does not provide any explicit statement or link indicating that the source code for their methodology is open-source or publicly available. |
| Open Datasets | Yes | We extract digits 0 and 1 from the MNIST dataset [LeCun et al., 1998]... We use adversarial training with and without reflected loss (denoted by R-PGD and PGD, respectively) to train a PreActResNet (PARN) He et al. [2016] on the CIFAR-10 dataset Krizhevsky et al. [2009]. |
| Dataset Splits | No | The dataset contains 12665 training samples and 2115 test samples. The paper specifies train and test sets but does not explicitly mention a validation set split. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments. |
| Software Dependencies | No | The paper mentions building on a 'PyTorch implementation' but does not specify any software versions for reproducibility. |
| Experiment Setup | Yes | The outer loop consists of 20 epochs over the training data with batch size equal to 64, randomly shuffled at the beginning of each epoch. The initial learning rate is set to 1, and is decayed by a multiplicative factor of 0.2 every 5 epochs. We use an SGD optimizer with a momentum parameter of 0.9 and weight decay parameter of 5e-4. We set the batch size to 128 and train each model for 20 epochs. We use a cyclic scheduler which increases the learning rate linearly from 0 to 0.2 within the first 10 epochs and then reduces it back to 0 in the remaining 10 epochs. |
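The quoted setup describes two learning-rate schedules: a step decay (initial rate 1, multiplied by 0.2 every 5 epochs) and a triangular cyclic schedule (0 up to 0.2 over the first 10 epochs, back to 0 over the last 10). A minimal sketch of both as functions of the epoch index follows; the 0-indexed epoch convention is an assumption, not something the paper states.

```python
def step_lr(epoch, lr0=1.0, decay=0.2, every=5):
    """Step decay: initial learning rate lr0, multiplied by `decay`
    every `every` epochs. Epochs are assumed 0-indexed."""
    return lr0 * decay ** (epoch // every)

def cyclic_lr(epoch, peak=0.2, total=20):
    """Triangular cyclic schedule: linear ramp from 0 to `peak` over the
    first half of training, then linearly back down to 0."""
    half = total / 2
    if epoch <= half:
        return peak * epoch / half
    return peak * (total - epoch) / half
```

For example, `step_lr(5)` gives 0.2 and `cyclic_lr(10)` gives the peak rate 0.2, matching the quoted description.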
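The pseudocode the review points to (Algorithm 1, a PGD attack) is the standard projected gradient descent attack of Madry et al.: repeated signed-gradient ascent on the loss, projected back onto an l-infinity ball. A minimal self-contained sketch for a linear classifier with logistic loss (so the gradient is available in closed form) is below; the function name and hyperparameter defaults are illustrative, not taken from the paper, which applies PGD to deep networks.

```python
import numpy as np

def pgd_attack_linear(w, x, y, eps=0.1, alpha=0.02, steps=20):
    """PGD attack on a linear classifier with logistic loss.

    Maximizes loss(w, x + delta, y) over the ball ||delta||_inf <= eps
    via iterated signed-gradient ascent with projection (clipping).
    y is a label in {-1, +1}. Hypothetical sketch of the standard attack.
    """
    delta = np.zeros_like(x)
    for _ in range(steps):
        margin = y * np.dot(w, x + delta)
        # gradient of log(1 + exp(-margin)) w.r.t. delta
        grad = -y * w / (1.0 + np.exp(margin))
        delta = delta + alpha * np.sign(grad)   # ascent step on the loss
        delta = np.clip(delta, -eps, eps)       # project onto the l_inf ball
    return x + delta
```

After the attack, the perturbed input `x + delta` has a strictly smaller margin than `x`, i.e. a larger logistic loss, while the perturbation stays within the eps-ball.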