SARAD: Spatial Association-Aware Anomaly Detection and Diagnosis for Multivariate Time Series
Authors: Zhihao Dai, Ligang He, Shuanghua Yang, Matthew Leeke
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We present experimental results to demonstrate that SARAD achieves state-of-the-art performance, providing robust anomaly detection and a nuanced understanding of anomalous events. |
| Researcher Affiliation | Academia | Zhihao Dai Department of Computer Science University of Warwick Coventry, UK zhihao.dai@warwick.ac.uk; Ligang He Department of Computer Science University of Warwick Coventry, UK ligang.he@warwick.ac.uk; Shuang-Hua Yang Department of Computer Science University of Reading Reading, UK shuang-hua.yang@reading.ac.uk; Matthew Leeke School of Computer Science University of Birmingham Birmingham, UK m.leeke@bham.ac.uk |
| Pseudocode | No | The paper provides architectural diagrams and mathematical equations (e.g., Eq. 1, 2, 3) but does not include any explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is available at https://github.com/daidahao/SARAD/, with specific instructions and scripts to reproduce the experimental results. During the review period, an anonymized version of the code was made openly available. |
| Open Datasets | Yes | We evaluate on four real-world datasets collected under industrial control and service monitoring settings. These datasets are: 1) the Server Machine Dataset (SMD) (Su et al., 2019b,a); 2) the Pooled Server Metrics (PSM) dataset (Abdulaal et al., 2021a,b); 3) the Secure Water Treatment (SWaT) dataset (Mathur and Tippenhauer, 2016; iTrust, 2023); and 4) the Hardware-In-the-Loop-based Augmented ICS (HAI) dataset (Shin et al., 2021b,a). |
| Dataset Splits | Yes | For hyperparameter tuning, the training set is temporally partitioned into 80% for training and 20% for validation (see the split sketch after the table). |
| Hardware Specification | Yes | All experiments are run on a single NVIDIA A10 (24GB) GPU. All experiments on time overheads are performed on a compute node with AMD EPYC 7443 (48 cores, 96 threads) CPU, NVIDIA A10 (24GB) GPU, and 512 GB RAM. |
| Software Dependencies | No | The paper states 'We implement SARAD in Python using the PyTorch library (Paszke et al., 2019) and the Hydra framework (Yadan, 2019)', but does not provide specific version numbers for these software dependencies. |
| Experiment Setup | Yes | The Adam optimizer (Kingma and Ba, 2015) is used, and the learning rate is halved every epoch for 3 epochs to prevent over-fitting. The time window size is 2W = 100. The data reconstruction module has H = 8 attention heads per layer, with attention length D = 512 and hidden length D_FF = 2048. For hyperparameter tuning, the training set is temporally partitioned into 80% for training and 20% for validation. The progression module by default has a hidden length of D_P = 64. We then perform TPE sampling to search the weight λ_LS ∈ [10⁻², 10²] for the progression reconstruction loss L_S on the validation set (see the configuration sketch after the table). |
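The 80/20 split reported above is chronological rather than random, which matters for time series: shuffling would leak future context into training. Below is a minimal sketch of such a temporal partition, assuming windowed training data held in a NumPy array; the function and variable names are illustrative, not taken from the SARAD repository.

```python
import numpy as np

def temporal_split(series: np.ndarray, val_ratio: float = 0.2):
    """Chronological (unshuffled) partition: the first 80% of the
    training set is kept for training, the last 20% for validation,
    avoiding temporal leakage between the two."""
    split = int(len(series) * (1.0 - val_ratio))
    return series[:split], series[split:]

# Example: 10,000 timesteps with 38 channels (as in SMD)
series = np.random.randn(10_000, 38)
train, val = temporal_split(series)  # 8,000 / 2,000 timesteps
```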
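The experiment-setup row maps directly onto a standard PyTorch Transformer configuration. The sketch below wires the stated hyperparameters into a plausible setup; the layer count, the use of `nn.TransformerEncoder`, and the base learning rate are assumptions, not details from SARAD's actual code.

```python
import torch
import torch.nn as nn

# Hyperparameters quoted from the paper's experiment setup
WINDOW = 100   # time window size 2W = 100
D_MODEL = 512  # attention length D
N_HEADS = 8    # attention heads H per layer
D_FF = 2048    # feed-forward hidden length D_FF

# Illustrative reconstruction backbone (layer count is an assumption)
encoder_layer = nn.TransformerEncoderLayer(
    d_model=D_MODEL, nhead=N_HEADS, dim_feedforward=D_FF, batch_first=True
)
model = nn.TransformerEncoder(encoder_layer, num_layers=1)

# Adam optimizer; the base learning rate is an assumption, but the
# schedule (halved every epoch for 3 epochs) follows the paper.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    # ... one pass over windowed training batches would go here ...
    scheduler.step()  # halve the learning rate after each epoch
```

A TPE sampler such as Optuna's `TPESampler` could drive the search for λ_LS over [10⁻², 10²], though the specific implementation used is not stated here.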