Black-Box Adversarial Attack on Time Series Classification

Authors: Daizong Ding, Mi Zhang, Fuli Feng, Yuanmin Huang, Erling Jiang, Min Yang

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments on three real-world TSC datasets and five DNN based models validate the effectiveness of BlackTreeS, e.g., it improves the attack success rate from 19.3% to 27.3%, and decreases the detection success rate from 90.9% to 6.8% for LSTM on the UWave dataset."
Researcher Affiliation | Academia | Daizong Ding (1), Mi Zhang (1), Fuli Feng (2), Yuanmin Huang (1), Erling Jiang (1), Min Yang (1)*; (1) School of Computer Science, Fudan University, China; (2) University of Science and Technology of China. Emails: {17110240010@, mi_zhang@, yuanminhuang21@m., eljiang21@m., m_yang@}fudan.edu.cn; fulifeng93@gmail.com
Pseudocode | Yes | "Algorithm 1 in Appendix C summarizes the overall framework."
Open Source Code | No | The paper does not state that source code for the described methodology is publicly available, nor does it link to a code repository.
Open Datasets | No | "We conduct the experiments on three time series classification datasets: Uwave, Climate and Eye."
Dataset Splits | No | Table 4 gives 'Train Set' and 'Test Set' sizes for each dataset, but there is no mention of a validation set or of the splitting methodology (e.g., percentages, random seed, cross-validation).
Hardware Specification | Yes | "All the experiments are conducted on a machine with a 20-core CPU, 256GBs of memory and 5 NVIDIA RTX 2080Ti GPUs."
Software Dependencies | No | The paper mentions optimizers (RMSProp, Adam) and general model types (CNN, RNN, self-attention), but it does not name software libraries or frameworks with version numbers (e.g., TensorFlow 2.x, PyTorch 1.x).
Experiment Setup | Yes | "For all DNN based classifiers, the hidden size and the learning rate are set as 20 and 0.005 respectively. The optimizer of RNN is the RMSProp, while the optimizer of the CNN and self-attention model is the Adam (Diederik, Jimmy et al. 2015). For the BlackTreeS, the K is 20 and the maximal size of perturbed positions is 100. We adopt a quadtree to perform the tree search strategy. For ϵ, the default value is set as 0.3, which is widely used in previous adversarial attacks on TSC models (Oregi et al. 2018)."
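The attack constraints quoted in the experiment setup (a per-position budget of ϵ = 0.3 and at most 100 perturbed positions) can be sketched as a simple clipping routine. This is a minimal illustration, not the paper's released implementation (none is available); all function and variable names are our own.

```python
# Hedged sketch of the reported perturbation constraints: an epsilon
# budget of 0.3 per position and at most 100 perturbed positions.
# Names are illustrative; the paper does not release code.

EPSILON = 0.3        # default per-position perturbation bound
MAX_POSITIONS = 100  # maximal size of the perturbed-position set

def apply_sparse_perturbation(series, deltas, positions):
    """Perturb `series` at `positions` by `deltas`, each clipped to [-EPSILON, EPSILON]."""
    if len(positions) > MAX_POSITIONS:
        raise ValueError(f"at most {MAX_POSITIONS} positions may be perturbed")
    perturbed = list(series)
    for i, d in zip(positions, deltas):
        perturbed[i] += max(-EPSILON, min(EPSILON, d))
    return perturbed

x = [0.0] * 10                                                    # toy time series
adv = apply_sparse_perturbation(x, deltas=[0.5, -0.7], positions=[2, 5])
# adv[2] == 0.3 and adv[5] == -0.3 after clipping; other positions unchanged
```

Bounding both the magnitude (ϵ) and the number of perturbed positions keeps the adversarial series close to the original, which is the usual imperceptibility requirement in adversarial attacks on time series classifiers.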