Stability Analysis and Generalization Bounds of Adversarial Training
Authors: Jiancong Xiao, Yanbo Fan, Ruoyu Sun, Jue Wang, Zhi-Quan Luo
NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments of adversarial training on CIFAR-10. Figure 1: robust overfitting in the experiments on (a) CIFAR-10, (b) CIFAR-100, (c) SVHN, and (d) ImageNet. |
| Researcher Affiliation | Collaboration | 1The Chinese University of Hong Kong, Shenzhen; 2Tencent AI Lab; 3Shenzhen Research Institute of Big Data |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | https://github.com/JiancongXiao/Stability-of-Adversarial-Training |
| Open Datasets | Yes | We mainly consider the experiments on CIFAR-10 (Krizhevsky et al., 2009), CIFAR-100, and SVHN (Netzer et al., 2011). We also provide one experiment on ImageNet (Deng et al., 2009). |
| Dataset Splits | No | The paper mentions using a validation set for early stopping ('For example, we can use a validation set to determine when to stop.') but does not specify how that validation set is split from the main dataset. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments, only a general mention of 'GPU'. |
| Software Dependencies | No | The paper does not name specific software dependencies (e.g., libraries or solvers) with version numbers. |
| Experiment Setup | Yes | The step size in the inner maximization is set to ϵ/4 on CIFAR-10 and CIFAR-100 and to ϵ/8 on SVHN. Weight decay is set to 5 × 10⁻⁴. In Fig. 2 (a), (b), and (c), we show the experiments on the piecewise learning rate schedule, which is 0.1 over the first 100 epochs, down to 0.01 over the following 50 epochs, and finally 0.001 in the last 50 epochs, on CIFAR-10, CIFAR-100, and SVHN. |
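The Experiment Setup row pins down enough hyperparameters to sketch the training loop. Below is a minimal PyTorch sketch of ℓ∞ PGD adversarial training using the reported settings (inner-maximization step size ϵ/4, weight decay 5 × 10⁻⁴, and the piecewise 0.1/0.01/0.001 learning-rate schedule switching at epochs 100 and 150). The perturbation radius ϵ = 8/255, the 10 PGD steps, SGD momentum 0.9, batch size 128, and the ResNet-18 architecture are assumptions common in CIFAR-10 adversarial training, not details stated in the excerpt above.

```python
import torch
import torch.nn as nn
import torch.optim as optim
import torchvision
import torchvision.transforms as T

EPS = 8 / 255        # perturbation radius (assumed; standard for CIFAR-10)
STEP_SIZE = EPS / 4  # inner-maximization step size reported for CIFAR-10/100
PGD_STEPS = 10       # number of PGD steps (assumed; not stated in the excerpt)
EPOCHS = 200         # 100 + 50 + 50, matching the piecewise schedule

def pgd_attack(model, x, y, eps=EPS, step=STEP_SIZE, steps=PGD_STEPS):
    """L-infinity PGD: ascend the loss, then project back into the eps-ball."""
    delta = torch.empty_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = nn.functional.cross_entropy(model(x + delta), y)
        grad = torch.autograd.grad(loss, delta)[0]
        delta = (delta + step * grad.sign()).clamp(-eps, eps).detach()
        delta = ((x + delta).clamp(0, 1) - x).requires_grad_(True)  # keep inputs in [0, 1]
    return (x + delta).detach()

train_set = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True, transform=T.ToTensor())
loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

model = torchvision.models.resnet18(num_classes=10)  # architecture is an assumption
opt = optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
# Piecewise schedule from the paper: 0.1 for 100 epochs, then 0.01 for 50, then 0.001 for 50.
sched = optim.lr_scheduler.MultiStepLR(opt, milestones=[100, 150], gamma=0.1)

for epoch in range(EPOCHS):
    for x, y in loader:
        x_adv = pgd_attack(model, x, y)      # inner maximization
        opt.zero_grad()
        nn.functional.cross_entropy(model(x_adv), y).backward()
        opt.step()                           # outer minimization
    sched.step()
```

`MultiStepLR` with milestones `[100, 150]` and `gamma=0.1` reproduces the stated schedule exactly: lr = 0.1 for epochs 0-99, 0.01 for epochs 100-149, and 0.001 for epochs 150-199.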