Understanding Backdoor Attacks through the Adaptability Hypothesis
Authors: Xun Xian, Ganghua Wang, Jayanth Srinivasa, Ashish Kundu, Xuan Bi, Mingyi Hong, Jie Ding
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on benchmark image datasets and state-of-the-art backdoor attacks for deep neural networks are conducted to corroborate the hypothesis. Theoretical analyses of backdoor attacks are also provided under a classical machine learning context. |
| Researcher Affiliation | Collaboration | 1. Department of ECE, University of Minnesota; 2. School of Statistics, University of Minnesota; 3. Cisco Research; 4. Carlson School of Management, University of Minnesota. |
| Pseudocode | Yes | G. Pseudo-code for visualisation algorithms and additional experimental results. Algorithm 1: Visualizing high-dimensional data |
| Open Source Code | No | The paper does not contain any explicit statement about open-sourcing its code or a link to a code repository. |
| Open Datasets | Yes | We use 3 popular datasets: MNIST (LeCun et al., 2010), CIFAR10 (Krizhevsky et al., 2009), and GTSRB (Stallkamp et al., 2012). (A hedged loading sketch follows the table.) |
| Dataset Splits | No | The paper mentions training and test sets but does not provide specific details about a validation split, explicit percentages for data partitioning, or cross-validation setup. |
| Hardware Specification | Yes | All of our experiments are conducted on a workstation with one A100 GPU. |
| Software Dependencies | No | The paper mentions machine learning models (LeNet, ResNet, VGG) and optimization algorithms (SGD) but does not specify software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | For ResNet and VGG models, we adopt the standard training pipeline of SGD with a momentum of 0.9, a weight decay of 10⁻⁴, and a batch size of 128 for optimization. For LeNet, we adopt the standard training pipeline of SGD with an initial learning rate of 0.1/0.01. (A hedged configuration sketch follows the table.) |
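
For context on the Open Datasets row, here is a minimal loading sketch for the three benchmarks the paper names. The paper does not state its data-loading stack, so torchvision is an assumption, and the `root` path is a placeholder:

```python
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

# Each of the three public benchmarks downloads on first use.
mnist = datasets.MNIST(root="./data", train=True, download=True, transform=to_tensor)
cifar10 = datasets.CIFAR10(root="./data", train=True, download=True, transform=to_tensor)
gtsrb = datasets.GTSRB(root="./data", split="train", download=True, transform=to_tensor)
```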
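
Likewise, a sketch of the training configuration quoted in the Experiment Setup row. The momentum, weight decay, and batch size are the paper's reported values; the architecture (`resnet18`), the CIFAR10 choice, the learning rate of 0.1 (the paper only states 0.1/0.01 for LeNet), and the single-step loop are illustrative assumptions:

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import resnet18

# Placeholder model and dataset; the paper trains ResNet/VGG/LeNet variants
# whose exact versions are not reported.
model = resnet18(num_classes=10)
train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())

# Values quoted in the paper: momentum 0.9, weight decay 1e-4, batch size 128.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
loader = DataLoader(train_set, batch_size=128, shuffle=True)

criterion = torch.nn.CrossEntropyLoss()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    break  # one step shown; the epoch budget is not reported
```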