On Diffusion Modeling for Anomaly Detection
Authors: Victor Livernoche, Vineet Jain, Yashar Hezaveh, Siamak Ravanbakhsh
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through empirical evaluations on the ADBench benchmark, we demonstrate that all diffusion-based anomaly detection methods perform competitively for both semi-supervised and unsupervised settings. |
| Researcher Affiliation | Academia | 1 School of Computer Science, McGill University; 2 Department of Physics, University of Montreal; 3 Mila - Quebec AI Institute. (Equal contribution) |
| Pseudocode | Yes | Algorithm 1 Training Process for parametric DTE (see the sketch after the table) |
| Open Source Code | Yes | Code available at https://github.com/vicliv/DTE |
| Open Datasets | Yes | We perform experiments on the ADBench benchmark (Han et al., 2022), which comprises a set of popular tabular anomaly detection datasets as well as newly created tabular datasets made from images and natural language tasks, all described in Appendix D.1. |
| Dataset Splits | No | The paper describes training and test data configurations and mentions hyperparameter tuning, but it does not explicitly specify a separate validation split (as percentages or counts). |
| Hardware Specification | Yes | The total amount of compute required to reproduce our experiments with five seeds, including all of the baselines and the proposed DTE model amounts to 473 GPU-hours for the unsupervised setting and 225 GPU-hours for the semi-supervised setting on an RTX8000 GPU with 48 gigabytes of memory for running the ADBench datasets. |
| Software Dependencies | No | The paper lists hyperparameters and model architectures but does not specify software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions). |
| Experiment Setup | Yes | Table 2: Hyperparameters for parametric DTE model (Hidden layer sizes: [256, 512, 256]; Activation function: ReLU; Optimizer: Adam; Learning rate: 0.0001; Dropout: 0.5; Batch size: 64; Number of epochs: 400; Maximum timestep: 300; Number of bins: 7) |
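To make the table concrete, here is a minimal sketch of the parametric DTE training process (Algorithm 1) wired up with the Table 2 hyperparameters. It assumes a standard DDPM-style variance-preserving forward process with a linear beta schedule; the names `DTEClassifier`, `diffuse`, and `train` are illustrative and do not come from the authors' repository linked above, which remains the reference implementation.

```python
# A minimal sketch of parametric DTE training (Algorithm 1), assuming a
# DDPM-style variance-preserving forward process with a linear beta
# schedule. Names are illustrative, not the authors' reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F

T_MAX, N_BINS = 300, 7  # Table 2: maximum timestep, number of bins

# Noise schedule: beta_t linear in t, alpha_bar_t = prod_{s<=t} (1 - beta_s).
betas = torch.linspace(1e-4, 2e-2, T_MAX)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

def diffuse(x0, t):
    """Sample x_t ~ q(x_t | x_0) for a batch of per-example timesteps t."""
    a = alpha_bar[t].unsqueeze(-1)                # (B, 1)
    eps = torch.randn_like(x0)
    return a.sqrt() * x0 + (1.0 - a).sqrt() * eps

class DTEClassifier(nn.Module):
    """MLP that classifies the (binned) diffusion time of a noised input."""
    def __init__(self, dim):
        super().__init__()
        sizes, layers = [dim, 256, 512, 256], []  # Table 2: hidden sizes
        for d_in, d_out in zip(sizes, sizes[1:]):
            layers += [nn.Linear(d_in, d_out), nn.ReLU(), nn.Dropout(0.5)]
        layers.append(nn.Linear(sizes[-1], N_BINS))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)  # logits over timestep bins

def train(model, loader, epochs=400):
    """Algorithm 1: noise each batch at a random t, predict the bin of t."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)  # Table 2
    model.train()
    for _ in range(epochs):
        for (x0,) in loader:                      # batch size 64 per Table 2
            t = torch.randint(0, T_MAX, (x0.shape[0],))
            y = t * N_BINS // T_MAX               # bin label in [0, N_BINS)
            loss = F.cross_entropy(model(diffuse(x0, t)), y)
            opt.zero_grad(); loss.backward(); opt.step()
```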
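At test time, DTE scores a point by how far along the diffusion it appears to be: inliers should resemble early timesteps and anomalies late ones. A hedged usage sketch, assuming the expected timestep under the predicted bin distribution is used as the score (a mode-based score would be a natural alternative):

```python
@torch.no_grad()
def anomaly_score(model, x):
    """Score = expected diffusion time of the input; higher = more anomalous."""
    model.eval()                                  # disable dropout for scoring
    probs = model(x).softmax(dim=-1)              # (B, N_BINS)
    centers = (torch.arange(N_BINS) + 0.5) * T_MAX / N_BINS
    return probs @ centers                        # (B,) expected timestep
```

Note that in this reading no noise is added at inference; the classifier is simply asked which diffusion time the raw test point most resembles.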