Towards Reliable and Efficient Backdoor Trigger Inversion via Decoupling Benign Features
Authors: Xiong Xu, Kunzhe Huang, Yiming Li, Zhan Qin, Kui Ren
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on benchmark datasets demonstrate that our defenses can reach state-of-the-art performances. |
| Researcher Affiliation | Collaboration | Xiong Xu¹, Kunzhe Huang³, Yiming Li¹,², Zhan Qin¹, Kui Ren¹ (¹The State Key Laboratory of Blockchain and Data Security, Zhejiang University; ²ZJU-Hangzhou Global Scientific and Technological Innovation Center; ³Alibaba Cloud) |
| Pseudocode | Yes | Algorithm 1 The algorithm of our BTI-DBF (U). |
| Open Source Code | Yes | Our codes are available at https://github.com/xuxiong0214/BTIDBF. |
| Open Datasets | Yes | We conduct experiments on three benchmark datasets, including CIFAR-10 (Krizhevsky, 2009), GTSRB (Houben et al., 2013), and (a subset of) ImageNet (Deng et al., 2009). |
| Dataset Splits | No | The paper specifies the number of training and testing samples for each dataset (e.g., 'CIFAR-10... consists of 50,000 training samples and 10,000 testing samples') but does not explicitly provide details for a separate validation split used during model training. |
| Hardware Specification | Yes | In this paper, we run all experiments on a single RTX 3090 Ti GPU with PyTorch. |
| Software Dependencies | No | In this paper, we run all experiments on a single RTX 3090 Ti GPU with PyTorch. - While PyTorch is mentioned, its version number is not specified, and no other key software dependencies are listed with versions. |
| Experiment Setup | Yes | We adopt the SGD with a momentum of 0.9 and a weight decay of 5×10⁻⁴ as the optimizer to train all attacked DNNs. Specifically, we set the batch size as 128 on CIFAR-10 and GTSRB, while it is set to 32 on ImageNet. We set the initial learning rate as 0.1 and train all models for 300 epochs. The learning rate will be multiplied by a factor of 0.1 at the 100-th epoch. |
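For reference, the quoted training configuration maps directly onto a standard PyTorch setup. The sketch below is illustrative rather than the authors' released code: the model, the dataset choice (CIFAR-10 here), and the training loop are placeholders, and only the hyperparameters (SGD with momentum 0.9, weight decay 5×10⁻⁴, initial learning rate 0.1, 300 epochs, learning rate ×0.1 at epoch 100, batch size 128 for CIFAR-10/GTSRB and 32 for ImageNet) come from the quote above.

```python
# Minimal sketch of the reported training setup (assumptions: model
# architecture and data augmentation are unspecified placeholders).
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms


def train_attacked_dnn(model: nn.Module, device: str = "cuda") -> None:
    transform = transforms.ToTensor()
    # CIFAR-10 as quoted; GTSRB and the ImageNet subset would be analogous,
    # with the batch size lowered to 32 for ImageNet.
    train_set = torchvision.datasets.CIFAR10(
        root="./data", train=True, download=True, transform=transform
    )
    loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

    # SGD with momentum 0.9 and weight decay 5e-4, initial LR 0.1.
    optimizer = torch.optim.SGD(
        model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4
    )
    # Multiply the learning rate by 0.1 at the 100th epoch, as reported.
    scheduler = torch.optim.lr_scheduler.MultiStepLR(
        optimizer, milestones=[100], gamma=0.1
    )
    criterion = nn.CrossEntropyLoss()

    model.to(device).train()
    for epoch in range(300):  # 300 epochs, as reported
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        scheduler.step()
```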