Are adversarial examples inevitable?
Authors: Ali Shafahi, W. Ronny Huang, Christoph Studer, Soheil Feizi, Tom Goldstein
ICLR 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, in Section 8, we explore the causes of adversarial susceptibility in real datasets, and the effect of dimensionality. We present an example image class for which there is no fundamental link between dimensionality and robustness, and argue that the data distribution, and not dimensionality, is the primary cause of adversarial susceptibility. |
| Researcher Affiliation | Academia | Anonymous authors. Paper under double-blind review. |
| Pseudocode | No | No pseudocode or algorithm blocks were found in the paper. |
| Open Source Code | No | The paper does not provide any specific links or explicit statements about the release of source code for the methodology described. |
| Open Datasets | Yes | One place where researchers have enjoyed success is at training classifiers on low-dimensional datasets like MNIST (Madry et al., 2017; Sinha et al., 2018). ... We can reduce U_c and dramatically increase susceptibility by choosing a more spread out dataset, like CIFAR-10, in which adjacent pixels are less strongly correlated and images appear to concentrate near complex, higher-dimensional manifolds. |
| Dataset Splits | Yes | Adversarial examples for MNIST/CIFAR-10 were produced as in Madry et al. (2017) using 100-step/20-step PGD. |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory amounts) used for running experiments were provided in the paper. |
| Software Dependencies | No | No specific software dependencies with version numbers were mentioned in the paper. |
| Experiment Setup | Yes | Adversarial examples with different norm constraints formed via the projected gradient method (Madry et al., 2017) on Resnet50... Adversarial examples for MNIST/CIFAR-10 were produced as in Madry et al. (2017) using 100-step/20-step PGD. |
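
The Experiment Setup and Dataset Splits rows both cite adversarial examples generated with projected gradient descent (PGD) as in Madry et al. (2017). Below is a minimal sketch of an l_inf PGD attack under stated assumptions: it presumes a PyTorch classifier with inputs scaled to [0, 1], and the `eps`, `step_size`, and `steps` values are illustrative placeholders, not the paper's exact settings (the paper reports 100-step PGD for MNIST and 20-step PGD for CIFAR-10).

```python
# Minimal sketch of an l_inf PGD attack in the style of Madry et al. (2017).
# Assumptions (not taken from the paper): a PyTorch model returning class logits,
# inputs scaled to [0, 1], and illustrative eps/step-size values.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, step_size=2/255, steps=20):
    """Return adversarial examples constrained to an l_inf ball of radius eps around x."""
    # Random start inside the eps-ball, clipped to the valid image range.
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0.0, 1.0).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + step_size * grad.sign()           # ascend the loss
            x_adv = x + (x_adv - x).clamp(-eps, eps)          # project back into the eps-ball
            x_adv = x_adv.clamp(0.0, 1.0)                     # stay in the valid image range
    return x_adv.detach()
```

As a usage example, a CIFAR-10-style run along the lines of `x_adv = pgd_attack(resnet_model, images, labels, steps=20)` would mirror the 20-step configuration quoted above; the model name here is hypothetical.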