Neural Architecture Search for Wide Spectrum Adversarial Robustness
Authors: Zhi Cheng, Yanxi Li, Minjing Dong, Xiu Su, Shan You, Chang Xu
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on benchmark datasets such as CIFAR and ImageNet demonstrate that with a significantly richer search signal in robustness, our method can find architectures with improved overall robustness while having a limited impact on natural accuracy and around 40% reduction in search time compared with the naive approach of searching. |
| Researcher Affiliation | Collaboration | Zhi Cheng¹*, Yanxi Li¹, Minjing Dong¹, Xiu Su¹, Shan You², Chang Xu¹*; ¹School of Computer Science, Faculty of Engineering, The University of Sydney; ²SenseTime Research; {zche6824, yali0722, mdon0736, xisu5992}@uni.sydney.edu.au, youshan@sensetime.com, c.xu@sydney.edu.au |
| Pseudocode | Yes | Algorithm 1: Wsr-NAS algorithm |
| Open Source Code | Yes | Codes available at: https://github.com/zhicheng2T0/Wsr-NAS.git |
| Open Datasets | Yes | Extensive experiments on benchmark datasets such as CIFAR and ImageNet demonstrate that with a significantly richer search signal in robustness, our method can find architectures with improved overall robustness while having a limited impact on natural accuracy and around 40% reduction in search time compared with the naive approach of searching. |
| Dataset Splits | Yes | We let datasets D_t and D_v be different halves of the CIFAR-10 training set, let M_a be 600, let K_a be 1, and let B in the Wsr-NAS algorithm be 10. (See the data-split sketch below the table.) |
| Hardware Specification | No | The paper mentions 'total GPU time' and refers to 'National Computational Infrastructure (NCI)' and 'Sydney Informatics Hub HPC Allocation Scheme' but does not specify any particular GPU models, CPU types, or other detailed hardware specifications. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies such as programming languages, libraries (e.g., PyTorch, TensorFlow), or other tools used. |
| Experiment Setup | Yes | We let datasets D_t and D_v be different halves of the CIFAR-10 training set, let M_a be 600, let K_a be 1, and let B in the Wsr-NAS algorithm be 10. To find WsrNet using a moderate or a large number of adversarial noise strengths, we search for WsrNet-Basic and WsrNet-Plus by setting (N_1, N_2) to (3, 3) and (3, 8) respectively. α and β are set to 0.8 and 0.2 by default. In all experiments, the step size of the PGD attack is set to ϵ_i · 2.5/20 at adversarial noise strength ϵ_i, as in (Madry et al. 2019). (See the PGD sketch below the table.) |
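
The D_t/D_v split quoted in the Dataset Splits row is straightforward to reproduce. Below is a minimal sketch assuming PyTorch/torchvision; the random seed is an assumption, since the paper does not state how the two halves were drawn.

```python
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms

# Two disjoint halves of the 50,000-image CIFAR-10 training set,
# playing the roles of D_t and D_v from the quoted setup.
train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())
half = len(train_set) // 2
D_t, D_v = random_split(train_set, [half, len(train_set) - half],
                        generator=torch.Generator().manual_seed(0))  # seed is assumed
```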
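
The step-size rule ϵ_i · 2.5/20 ties the PGD step to the current noise strength, so a 20-step attack can traverse the ϵ_i ball with some slack. Below is a minimal L-inf PGD sketch under that rule, following the formulation of Madry et al. 2019; the function name `pgd_attack` and the random start are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps, n_steps=20):
    """L-inf PGD with the paper's step-size rule (hypothetical sketch).

    model: classifier mapping images to logits; x: input batch in [0, 1];
    y: integer class labels; eps: adversarial noise strength ϵ_i.
    """
    alpha = eps * 2.5 / 20  # step size at noise strength ϵ_i
    # Random start inside the ϵ-ball (an assumption; common PGD practice).
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(n_steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()                    # ascent step
            x_adv = torch.min(torch.max(x_adv, x - eps), x + eps)  # project to ϵ-ball
            x_adv = x_adv.clamp(0, 1)                              # valid pixel range
    return x_adv.detach()
```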