Perturbation Guiding Contrastive Representation Learning for Time Series Anomaly Detection
Authors: Liaoyuan Tang, Zheng Wang, Guanxiong He, Rong Wang, Feiping Nie
IJCAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on six real-world datasets demonstrate the significant superiority of our model over thirteen state-of-the-art competitors, with average improvements of 5.14% and 8.24% in F1 score and AUC-PR, respectively. |
| Researcher Affiliation | Academia | School of Artificial Intelligence, Optics and Electronics (iOPEN), Northwestern Polytechnical University, Xi'an, China. tangly@mail.nwpu.edu.cn, zhengwangml@gmail.com, heguanx@mail.nwpu.edu.cn, wangrong07@tsinghua.org.cn, feipingnie@gmail.com |
| Pseudocode | No | The paper includes diagrams and descriptions of its components, but no section or figure is explicitly labeled as 'Pseudocode' or 'Algorithm', nor is any structured algorithm presented in a code-like format. |
| Open Source Code | No | The paper does not contain an explicit statement about releasing the source code for the described methodology, nor does it provide a link to a code repository. |
| Open Datasets | Yes | Six real datasets shown in Table 1 for time series anomaly detection are adopted, including WaQ, DSADS, Epilepsy, ASD, SMD, and SWaT [Zhang et al., 2022; Xu et al., 2024]. ... The WaQ, Epilepsy, DSADS, ASD, SMD, and SWaT datasets have been widely used as benchmark datasets in previous literature [Lai et al., 2021; Xu et al., 2024; Campos et al., 2021; Deng et al., 2021; Li et al., 2021; Su et al., 2019; Tuli et al., 2022]. |
| Dataset Splits | Yes | Table 1 lists 'Training size' and 'Testing size' for each dataset. For Epilepsy and DSADS, the original data formats are collections of partitioned time series, hence no sliding window is needed. The WaQ, Epilepsy, DSADS, ASD, SMD, and SWaT datasets have been widely used as benchmark datasets in previous literature. (A generic sliding-window sketch follows the table.) |
| Hardware Specification | Yes | All the experiments are executed at a workstation with the Intel(R) Xeon(R) Gold 6248R 3.00GHz CPU and NVIDIA TITAN Xp GPU with 10GB RAM, running on the Ubuntu 20.04 operating system. |
| Software Dependencies | No | The paper mentions the operating system ('Ubuntu 20.04 operating system') but does not specify other software dependencies like programming languages, libraries, or frameworks with their version numbers. |
| Experiment Setup | Yes | We train models for up to 100 epochs, and the weight factor α of the contrastive network is set to 0.2 by default. We choose k = 5 as the parameter for selecting positive samples using nearest neighbors. (A sketch of this configuration follows the table.) |
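The Dataset Splits row notes that most datasets are segmented with a sliding window (Epilepsy and DSADS are already partitioned and skip this step), but the extracted text does not report the window length or stride. The NumPy snippet below is therefore only a generic sketch of that preprocessing step; `window=100` and `stride=10` are placeholder values, not the paper's settings.

```python
import numpy as np

def sliding_windows(series: np.ndarray, window: int = 100, stride: int = 10) -> np.ndarray:
    """Segment a multivariate time series of shape (T, D) into
    overlapping windows of shape (num_windows, window, D)."""
    T = series.shape[0]
    starts = range(0, T - window + 1, stride)
    return np.stack([series[s:s + window] for s in starts])

# Example: a toy 10-dimensional series of length 1,000
toy = np.random.randn(1000, 10)
windows = sliding_windows(toy, window=100, stride=10)
print(windows.shape)  # (91, 100, 10)
```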
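The Experiment Setup row gives k = 5 for nearest-neighbor positive-sample selection and α = 0.2 as the weight of the contrastive network, but no source code is released. The PyTorch sketch below shows one plausible reading of that configuration: an InfoNCE-style loss whose positives are each embedding's k in-batch nearest neighbors under cosine similarity, added to the main objective with weight α. The temperature, batch and embedding sizes, and the function name `knn_contrastive_loss` are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def knn_contrastive_loss(z: torch.Tensor, k: int = 5, temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss treating each embedding's k nearest in-batch
    neighbors (by cosine similarity) as its positives.
    z: (N, d) batch of window embeddings."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature                        # (N, N) similarity logits
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    sim_no_self = sim.masked_fill(eye, float("-inf"))    # exclude self-similarity
    pos_idx = sim_no_self.topk(k, dim=1).indices         # k most similar other samples
    log_prob = F.log_softmax(sim_no_self, dim=1)         # denominator over all non-self samples
    return -log_prob.gather(1, pos_idx).mean()

# Weighting reported in the paper: alpha = 0.2 for the contrastive branch.
alpha = 0.2
z = torch.randn(64, 128)                 # placeholder embeddings
reconstruction_loss = torch.tensor(0.0)  # placeholder for the main objective
total_loss = reconstruction_loss + alpha * knn_contrastive_loss(z, k=5)
```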