Adversarial Examples that Fool both Computer Vision and Time-Limited Humans

Authors: Gamaleldin Elsayed, Shreya Shankar, Brian Cheung, Nicolas Papernot, Alexey Kurakin, Ian Goodfellow, Jascha Sohl-Dickstein

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Here, we address this question by leveraging recent techniques that transfer adversarial examples from computer vision models with known parameters and architecture to other models with unknown parameters and architecture, and by matching the initial processing of the human visual system. We find that adversarial examples that strongly transfer across computer vision models influence the classifications made by time-limited human observers." (The ensemble-transfer idea is sketched in the first code example after the table.)
Researcher Affiliation | Collaboration | Gamaleldin F. Elsayed (Google Brain, gamaleldin.elsayed@gmail.com); Shreya Shankar (Stanford University); Brian Cheung (UC Berkeley); Nicolas Papernot (Pennsylvania State University); Alexey Kurakin (Google Brain); Ian Goodfellow (Google Brain); Jascha Sohl-Dickstein (Google Brain, jaschasd@google.com)
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any explicit statement about the release of source code or a link to a code repository for the described methodology.
Open Datasets | Yes | "In our experiment, we used images from ImageNet [7]."
Dataset Splits | No | The paper refers to a 'training set' and 'test models' but does not explicitly detail dataset splits for training, validation, and testing with percentages or counts.
Hardware Specification | No | The paper does not specify the hardware (GPU, CPU, or compute cluster details) used to run the machine learning experiments or to generate the adversarial examples.
Software Dependencies | No | The paper does not list specific software dependencies with version numbers (e.g., programming languages, libraries, frameworks).
Experiment Setup | Yes | "The details of this retinal layer are described in Appendix B." (A simplified retinal-blur layer is sketched in the second code example after the table.)
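
The Research Type row quotes the paper's core technique: crafting adversarial examples against an ensemble of surrogate models so that the perturbation transfers to models (and, in the paper's experiments, time-limited human observers) whose parameters are unknown. The snippet below is a minimal, hedged sketch of that ensemble-transfer recipe using a single signed-gradient (FGSM-style) step. The surrogate models, epsilon, and inputs are illustrative assumptions, not the authors' released code (the report above notes that no code is released), and the paper itself uses a stronger attack with retinal preprocessing in front of each model.

```python
# Sketch only: FGSM-style perturbation computed against the summed loss of an
# ensemble of surrogate models, the basic recipe behind transferable adversarial
# examples. Model choices, epsilon, and inputs are illustrative assumptions.
import torch
import torchvision.models as models

def ensemble_fgsm(image, label, surrogate_models, epsilon=8 / 255):
    """Perturb `image` with one signed-gradient step against the summed ensemble loss."""
    image = image.clone().detach().requires_grad_(True)
    loss_fn = torch.nn.CrossEntropyLoss()

    # Sum the classification loss over every surrogate model; a perturbation that
    # raises all of these losses at once is more likely to transfer to unseen models.
    total_loss = sum(loss_fn(m(image), label) for m in surrogate_models)
    total_loss.backward()

    # Take one step in the direction of the gradient sign and clip the result
    # back into the valid [0, 1] image range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

if __name__ == "__main__":
    # Illustrative surrogate ensemble with untrained weights; the paper's ensemble
    # (and its retinal preprocessing) differs.
    ensemble = [models.resnet18(weights=None).eval(),
                models.vgg11(weights=None).eval()]
    x = torch.rand(1, 3, 224, 224)   # placeholder image batch in [0, 1]
    y = torch.tensor([207])          # placeholder ImageNet class index
    x_adv = ensemble_fgsm(x, y, ensemble)
    print(x_adv.shape)               # torch.Size([1, 3, 224, 224])
```

Averaging (here, summing) the loss over several surrogates is what discourages perturbations that only exploit quirks of a single architecture, which is why such examples tend to transfer.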
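
The Experiment Setup row quotes the paper's pointer to its Appendix B, which specifies a "retinal layer" that mimics the eccentricity-dependent resolution of human vision. Since that appendix is not reproduced in this report, the following is only a simplified approximation of the idea, namely that blur grows with distance from a central fixation point; the blending rule and blur scale are assumptions, not the paper's specification.

```python
# Simplified, assumed approximation of an eccentricity-dependent "retinal" blur;
# the paper's exact layer is defined in its Appendix B and may differ.
import numpy as np
from scipy.ndimage import gaussian_filter

def retinal_blur(image, max_sigma=4.0):
    """image: H x W x C float array in [0, 1]; returns an eccentricity-blurred copy."""
    h, w = image.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalized distance of each pixel from the image centre, standing in for
    # eccentricity relative to the fixation point.
    radius = np.sqrt((yy - h / 2.0) ** 2 + (xx - w / 2.0) ** 2)
    weight = (radius / radius.max())[..., None]   # 0 at the centre, 1 at the corners

    # Blend the sharp image with a heavily blurred copy; the blurred copy dominates
    # in the periphery, approximating the fall-off of human spatial resolution.
    blurred = gaussian_filter(image, sigma=(max_sigma, max_sigma, 0))
    return (1.0 - weight) * image + weight * blurred

if __name__ == "__main__":
    img = np.random.rand(224, 224, 3)
    print(retinal_blur(img).shape)   # (224, 224, 3)
```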