Raising the Bar in Graph-level Anomaly Detection
Authors: Chen Qiu, Marius Kloft, Stephan Mandt, Maja Rudolph
IJCAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on nine real-world data sets involving nine techniques reveal that our method achieves an average performance improvement of 11.8% AUC compared to the best existing approach. |
| Researcher Affiliation | Collaboration | 1Bosch Center for Artificial Intelligence 2TU Kaiserslautern, Germany 3University of California, Irvine, USA |
| Pseudocode | No | No structured pseudocode or algorithm blocks with explicit labels were found. |
| Open Source Code | Yes | Code is available at https://github.com/boschresearch/GraphLevel-AnomalyDetection |
| Open Datasets | Yes | The datasets are made available by Morris et al. [2020], and the statistics of the datasets are given in Appendix A. |
| Dataset Splits | Yes | For each experimental variant, 10% of the normal class is set aside for the test set, and 10% of each of the other classes is added to the test set as anomalies. (The resulting fraction of anomalies in the test set is proportional to the class balance in the original dataset.) The remaining 90% of the normal class is used for training and validation. We use 10-fold cross-validation to estimate the model performance; in each fold, 10% of the training set is held out for validation. A sketch of this split protocol is given after the table. |
| Hardware Specification | No | Only "GPU clusters" are mentioned without specific model numbers or detailed hardware specifications. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., Python, PyTorch, CUDA versions, or library versions) are mentioned. |
| Experiment Setup | Yes | In particular, we use 4 GIN layers, each of which includes a two-layer MLP and graph normalization [Cai et al., 2020]. The dimension of the node representations is 32. The readout function of almost all methods consists of a two-layer MLP followed by an add pooling layer. ... Additional hyperparameter settings are recorded for reproducibility in Appendix B. A sketch of this backbone is given after the table. |
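
The dataset-split description above can be made concrete with a short sketch. This is a hedged reconstruction of the quoted protocol, not code from the authors' repository: the function name `make_fold_split`, the `normal_class` argument, and the per-fold random seeding are illustrative assumptions, and the fold handling is one plausible reading of the 10-fold description.

```python
# Illustrative sketch of the split protocol: 10% of the normal class and
# 10% of every other class form the test set; the remaining normal graphs
# are split 90/10 into training and validation data within each fold.
import numpy as np
from sklearn.model_selection import train_test_split

def make_fold_split(labels, normal_class, fold):
    labels = np.asarray(labels)
    rng = np.random.RandomState(fold)  # a different random split per fold (assumption)

    # 10% of the normal class goes to the test set ...
    normal_idx = np.where(labels == normal_class)[0]
    train_normal, test_normal = train_test_split(
        normal_idx, test_size=0.1, random_state=fold)

    # ... plus 10% of each other class as anomalies, so the anomaly fraction
    # mirrors the class balance of the original dataset.
    test_anomaly = []
    for c in np.unique(labels):
        if c == normal_class:
            continue
        class_idx = np.where(labels == c)[0]
        n_test = max(1, int(round(0.1 * len(class_idx))))
        test_anomaly.append(rng.choice(class_idx, n_test, replace=False))
    test_idx = np.concatenate([test_normal, *test_anomaly])

    # Remaining 90% of the normal class: hold out 10% for validation.
    train_idx, val_idx = train_test_split(
        train_normal, test_size=0.1, random_state=fold)
    return train_idx, val_idx, test_idx
```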
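The experiment-setup row can likewise be illustrated with a minimal PyTorch Geometric sketch of the described backbone: 4 GIN layers, each wrapping a two-layer MLP and graph normalization, 32-dimensional node representations, and a readout consisting of a two-layer MLP followed by add pooling. This is an assumption-laden reconstruction rather than the authors' implementation; the class name `GINBackbone` and all hyperparameters other than those quoted above are placeholders, and the anomaly-scoring objective is omitted.

```python
# Minimal sketch of the quoted backbone (not the authors' code).
import torch
from torch import nn
from torch_geometric.nn import GINConv, GraphNorm, global_add_pool

class GINBackbone(nn.Module):
    def __init__(self, in_dim, hidden_dim=32, num_layers=4):
        super().__init__()
        self.convs = nn.ModuleList()
        self.norms = nn.ModuleList()
        for i in range(num_layers):
            # Each GIN layer wraps a two-layer MLP; GraphNorm follows the layer.
            mlp = nn.Sequential(
                nn.Linear(in_dim if i == 0 else hidden_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, hidden_dim),
            )
            self.convs.append(GINConv(mlp))
            self.norms.append(GraphNorm(hidden_dim))
        # Readout: two-layer MLP on node embeddings, then add pooling.
        self.readout_mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, x, edge_index, batch):
        for conv, norm in zip(self.convs, self.norms):
            x = torch.relu(norm(conv(x, edge_index), batch))
        return global_add_pool(self.readout_mlp(x), batch)
```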