QUERY EFFICIENT DECISION BASED SPARSE ATTACKS AGAINST BLACK-BOX DEEP LEARNING MODELS
Authors: Viet Vo, Ehsan M. Abbasnejad, Damith Ranasinghe
ICLR 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We develop an evolution-based algorithm SparseEvo for the problem and evaluate it against both convolutional deep neural networks and vision transformers. (Abstract) and 4 EXPERIMENTS AND EVALUATIONS (Section title). |
| Researcher Affiliation | Academia | Viet Quoc Vo, Ehsan Abbasnejad, Damith C. Ranasinghe, The University of Adelaide, {viet.vo, ehsan.abbasnejad, damith.ranasinghe}@adelaide.edu.au |
| Pseudocode | Yes | Algorithm 1: SparseEvo (Section 3.2, page 4) and Algorithm 2: Initialise Population (Appendix A.1, page 10). |
| Open Source Code | No | The paper does not provide a direct link to a source-code repository, nor does it explicitly state that the code for the described methodology is publicly available. |
| Open Datasets | Yes | For a comprehensive evaluation of the effectiveness of SparseEvo, we employ two standard computer vision tasks with different dimensions: CIFAR10 (Krizhevsky et al.) and ImageNet (Deng et al., 2009). (Section 4.1) |
| Dataset Splits | Yes | For the evaluation sets, we select a balanced sample set. We randomly draw 1,000 and 200 correctly classified test images from CIFAR10 and ImageNet, respectively. (Section 4.1) and All of the parameter settings are summarized in Appendix A.2 (Section 4.1) and Table 1: Hyper-parameters setting in our experiments (Appendix A.2) |
| Hardware Specification | No | The paper does not specify the exact hardware components (e.g., specific GPU or CPU models, memory details) used for running the experiments. |
| Software Dependencies | No | The paper mentions 'PyTorch' in the context of pre-trained models but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | All of the parameter settings are summarized in Appendix A.2 (Section 4.1) and Table 1: Hyper-parameters setting in our experiments (Appendix A.2). |
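
The Dataset Splits row above quotes the paper's procedure of drawing correctly classified test images as the evaluation set. Below is a minimal PyTorch sketch of that filtering step for CIFAR10, assuming a standard torchvision workflow; the model handle `my_cifar10_model`, the batch size, and the sequential (rather than balanced random) selection are illustrative assumptions, not details taken from the paper.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


def sample_correctly_classified(model, dataset, n_samples, device="cpu", batch_size=256):
    """Collect indices of test images the model classifies correctly and keep
    the first n_samples of them (a simplified stand-in for the paper's
    balanced random draw of correctly classified images)."""
    model.eval()
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=False)
    correct_idx = []
    with torch.no_grad():
        for batch_id, (x, y) in enumerate(loader):
            x, y = x.to(device), y.to(device)
            preds = model(x).argmax(dim=1)
            hits = (preds == y).nonzero(as_tuple=True)[0]
            # Offset in-batch positions to global dataset indices (shuffle=False).
            correct_idx.extend((batch_id * batch_size + hits).tolist())
            if len(correct_idx) >= n_samples:
                break
    return correct_idx[:n_samples]


if __name__ == "__main__":
    test_set = datasets.CIFAR10(root="./data", train=False, download=True,
                                transform=transforms.ToTensor())
    # `my_cifar10_model` is a placeholder: the paper does not name the exact
    # pre-trained checkpoint, so any trained CIFAR10 classifier would be used here.
    # indices = sample_correctly_classified(my_cifar10_model, test_set, n_samples=1000)
```

The same routine would apply to the ImageNet evaluation set (200 images) by swapping the dataset and sample count; the attack hyper-parameters themselves are listed in the paper's Appendix A.2.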