LogAnomaly: Unsupervised Detection of Sequential and Quantitative Anomalies in Unstructured Logs
Authors: Weibin Meng, Ying Liu, Yichen Zhu, Shenglin Zhang, Dan Pei, Yuqing Liu, Yihao Chen, Ruizhi Zhang, Shimin Tao, Pei Sun, Rong Zhou
IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our evaluation on two public production log datasets shows that LogAnomaly outperforms existing log-based anomaly detection methods. |
| Researcher Affiliation | Collaboration | Tsinghua University; University of Toronto; Nankai University; Huawei; Beijing National Research Center for Information Science and Technology (BNRist) |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks (no clearly labeled algorithm sections or code-like formatted procedures). |
| Open Source Code | No | The paper does not provide access to the source code for the described methodology: no repository link, explicit code-release statement, or code in supplementary materials. |
| Open Datasets | Yes | We conduct experiments over the BGL dataset [Oliner and Stearley, 2007] and the HDFS dataset [Xu et al., 2009] |
| Dataset Splits | No | The paper states, 'we leverage the front 80% (according to the timestamps of logs) as the training data, and the rest 20% as the testing data.' This describes a train/test split but does not mention a validation set or its details (a minimal Python sketch of this chronological split follows the table). |
| Hardware Specification | Yes | We conduct all the experiments on a Linux server with Intel Xeon 2.40 GHz CPU and 64G memory. |
| Software Dependencies | Yes | We implement LogAnomaly and DeepLog with Python 3.6 and Keras 2.1. |
| Experiment Setup | Yes | The LogAnomaly in our experiments has two LSTM layers with 128 neurons, and the size (step) of window is 20 (1). (A Keras sketch approximating this configuration follows the table.) |
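
To make the Dataset Splits row concrete, below is a minimal Python sketch of the chronological 80/20 split the paper describes. The `chronological_split` helper and `example_logs` data are hypothetical names used for illustration; the paper does not release its preprocessing code.

```python
# Minimal sketch of the 80/20 split by timestamp described in the paper.
# `chronological_split` and `example_logs` are illustrative, not the authors' code.

def chronological_split(logs, train_fraction=0.8):
    """Sort log entries by timestamp and keep the earliest fraction for training."""
    logs_sorted = sorted(logs, key=lambda entry: entry[0])  # entry = (timestamp, log_key)
    cut = int(len(logs_sorted) * train_fraction)
    return logs_sorted[:cut], logs_sorted[cut:]

# Placeholder entries standing in for parsed BGL/HDFS log keys.
example_logs = [(1_500_000_000 + i, "template_%d" % (i % 5)) for i in range(10)]
train_logs, test_logs = chronological_split(example_logs)
print(len(train_logs), len(test_logs))  # 8 2
```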
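
The Experiment Setup and Software Dependencies rows (Python 3.6, Keras 2.1; two LSTM layers with 128 neurons; window size 20, step 1) can likewise be approximated with a short Keras sketch. This is a hedged reconstruction rather than the authors' implementation: it uses the current tf.keras API instead of Keras 2.1, an Embedding layer in place of the paper's own template representation, and a placeholder template-vocabulary size (`NUM_TEMPLATES`).

```python
# Hedged sketch of the reported configuration: sliding windows of 20 log keys
# (step 1) feeding two 128-unit LSTM layers that predict the next log key.
# NUM_TEMPLATES, the Embedding layer, and all training details are assumptions.

import numpy as np
import tensorflow as tf

WINDOW_SIZE = 20     # window size reported in the paper
STEP = 1             # window step reported in the paper
NUM_TEMPLATES = 300  # placeholder log-template vocabulary size (dataset-dependent)

def make_windows(key_sequence, window=WINDOW_SIZE, step=STEP):
    """Turn a sequence of log-key indices into (window, next_key) training pairs."""
    inputs, targets = [], []
    for start in range(0, len(key_sequence) - window, step):
        inputs.append(key_sequence[start:start + window])
        targets.append(key_sequence[start + window])
    return np.array(inputs), np.array(targets)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW_SIZE,)),
    tf.keras.layers.Embedding(NUM_TEMPLATES, 64),                 # assumed key encoding
    tf.keras.layers.LSTM(128, return_sequences=True),             # first LSTM layer
    tf.keras.layers.LSTM(128),                                    # second LSTM layer
    tf.keras.layers.Dense(NUM_TEMPLATES, activation="softmax"),   # next-key distribution
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy usage on a synthetic key sequence; real runs would use parsed BGL/HDFS keys.
keys = np.random.randint(0, NUM_TEMPLATES, size=500)
x, y = make_windows(keys)
model.fit(x, y, epochs=1, batch_size=64, verbose=0)
```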