Deep Orthogonal Hypersphere Compression for Anomaly Detection

Authors: Yunhe Zhang, Yan Sun, Jinyu Cai, Jicong Fan

ICLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The numerical and visualization results on benchmark datasets demonstrate the superiority of our methods in comparison to many baselines and state-of-the-art methods.
Researcher Affiliation | Academia | Yunhe Zhang (1,2), Yan Sun (1,3), Jinyu Cai (4), Jicong Fan (1,2). 1: School of Data Science, The Chinese University of Hong Kong, Shenzhen, China; 2: Shenzhen Research Institute of Big Data, Shenzhen, China; 3: School of Computing, National University of Singapore, Singapore; 4: Institute of Data Science, National University of Singapore, Singapore. Emails: zhangyhannie@gmail.com, yansun@comp.nus.edu.sg, jinyucai1995@gmail.com, fanjicong@cuhk.edu.cn
Pseudocode | Yes | Algorithm 1: Deep Orthogonal Hypersphere Contraction (DOHSC) ... Algorithm 2: Deep Orthogonal Bi-Hypersphere Compression (DO2HSC) ... Algorithm 3: Graph-Level Deep Orthogonal Hypersphere Contraction (see the first sketch after this table).
Open Source Code | Yes | Our source code is available at https://github.com/wownice333/DOHSC-DO2HSC.
Open Datasets | Yes | Datasets: Two image datasets (Fashion-MNIST, CIFAR-10) are chosen to conduct this experiment. Please refer to the detailed statistical descriptions in Appendix F. ... We further evaluate our models on six real-world graph datasets (COLLAB, COX2, ER_MD, MUTAG, DD and IMDB-Binary). Our experiments followed the standard one-class settings and data-split method of previous work (Zhao & Akoglu, 2021; Qiu et al., 2022).
Dataset Splits | Yes | Regarding the image and tabular datasets, the data splits are already provided in the paper. The detailed statistical information of all tested datasets is given in Tables 5 and 6. ... In the graph experiments, we adopted the classical AD method, One-Class SVM (OCSVM) (Schölkopf et al., 2001), to compare against graph-kernel baselines and used 10-fold cross-validation to make a fair comparison (see the cross-validation sketch after this table).
Hardware Specification | No | The paper does not provide specific hardware details used for running its experiments.
Software Dependencies | Yes | All graph kernels extract a kernel matrix via GraKeL (Siglidis et al., 2020) and apply the OCSVM in scikit-learn (Pedregosa et al., 2011) (see the kernel-pipeline sketch after this table).
Experiment Setup | Yes | Regarding our DOHSC model, we set 10 epochs in the pretraining stage to initialize the center of the decision boundary and then train the model for 200 epochs. The percentile ν of r was selected from {0.001, 0.003, 0.005, 0.008, 0.01, 0.03, 0.1, 0.3}. The improved method DO2HSC also uses a 10-epoch pretraining stage and trains DOHSC for 50 epochs to initialize a suitable center and decision boundaries r_max and r_min, where the percentile ν of r_max is the same as in DOHSC. The main training stage was set to 200 epochs. ... The dimensions of the GIN hidden and orthogonal projection layers were fixed at 16 and 8, respectively. For the backbone network, a 4-layer GIN and a 3-layer fully connected neural network were adopted (see the configuration sketch after this table).
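
The pseudocode row above names three algorithms without reproducing them. Below is a minimal, hypothetical PyTorch sketch of a DOHSC-style training step, assuming a Deep SVDD-like contraction loss on embeddings passed through an orthogonal projection layer with a fixed center, plus one assumed form of a bi-hypersphere (DO2HSC-style) objective. The class names, the SVD re-orthogonalization trick, and the loss forms are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class OrthogonalProjection(nn.Module):
    """Linear projection whose weight is re-orthogonalized after each update
    (one possible way to keep the projection layer orthogonal)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, z):
        return self.linear(z)

    @torch.no_grad()
    def orthogonalize(self):
        # Snap the weight back to the nearest (semi-)orthogonal matrix via SVD.
        u, _, vh = torch.linalg.svd(self.linear.weight, full_matrices=False)
        self.linear.weight.copy_(u @ vh)


def dohsc_step(encoder, proj, batch, center, optimizer):
    """One DOHSC-style step: contract projected embeddings toward a fixed center."""
    optimizer.zero_grad()
    z = proj(encoder(batch))
    loss = ((z - center) ** 2).sum(dim=1).mean()  # mean squared distance to the center
    loss.backward()
    optimizer.step()
    proj.orthogonalize()
    return loss.item()


def do2hsc_loss(z, center, r_min, r_max):
    """Assumed bi-hypersphere objective: penalize embeddings that fall outside
    the spherical shell between the inner radius r_min and the outer radius r_max."""
    dist = torch.norm(z - center, dim=1)
    return (torch.relu(dist - r_max) + torch.relu(r_min - dist)).mean()
```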
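For the software-dependency row, the following is a minimal sketch of the named GraKeL plus scikit-learn OCSVM pipeline. The MUTAG loader, the Weisfeiler-Lehman kernel choice, and the nu value are assumptions for illustration rather than the paper's exact configuration, and the GraKeL constructor arguments may differ slightly across library versions.

```python
from grakel.datasets import fetch_dataset
from grakel.kernels import WeisfeilerLehman, VertexHistogram
from sklearn.svm import OneClassSVM

# MUTAG is one of the six tested graph datasets; the loader and kernel are examples.
mutag = fetch_dataset("MUTAG", verbose=False)
graphs, labels = mutag.data, mutag.target

# Weisfeiler-Lehman subtree kernel over vertex labels (assumed kernel choice).
wl = WeisfeilerLehman(n_iter=5, base_graph_kernel=VertexHistogram, normalize=True)
K = wl.fit_transform(graphs)            # (n_graphs, n_graphs) kernel matrix

# OCSVM on the precomputed kernel; nu here is illustrative, not the paper's value.
ocsvm = OneClassSVM(kernel="precomputed", nu=0.1)
ocsvm.fit(K)
anomaly_scores = -ocsvm.decision_function(K)   # larger value = more anomalous
```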
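For the dataset-split row, here is a hypothetical sketch of a 10-fold one-class evaluation of the OCSVM baseline using a precomputed kernel matrix K (for example, the one produced in the previous sketch). The normal-label convention, nu, and the AUC metric are illustrative, not necessarily the exact protocol of Zhao & Akoglu (2021) or Qiu et al. (2022).

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import OneClassSVM
from sklearn.metrics import roc_auc_score

def ocsvm_10fold_auc(K, y, normal_label=0, nu=0.1, seed=0):
    """K: precomputed (n, n) kernel matrix; y: binary labels (normal vs. anomaly)."""
    skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
    aucs = []
    for train_idx, test_idx in skf.split(K, y):
        # Fit only on the normal graphs of the training fold (one-class setting).
        fit_idx = train_idx[y[train_idx] == normal_label]
        clf = OneClassSVM(kernel="precomputed", nu=nu)
        clf.fit(K[np.ix_(fit_idx, fit_idx)])
        # Higher decision values mean "more normal"; negate to get anomaly scores.
        scores = -clf.decision_function(K[np.ix_(test_idx, fit_idx)])
        aucs.append(roc_auc_score(y[test_idx] != normal_label, scores))
    return float(np.mean(aucs)), float(np.std(aucs))
```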
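Finally, the experiment-setup row can be restated as a compact configuration. The values below follow the quoted setup, but the dictionary keys are illustrative and may not match the argument names used in the released code.

```python
# Hypothetical configuration mirroring the quoted experiment setup.
config = {
    "pretrain_epochs": 10,           # initialize the center of the decision boundary
    "train_epochs": 200,             # main training stage
    "nu_grid": [0.001, 0.003, 0.005, 0.008, 0.01, 0.03, 0.1, 0.3],  # percentile of r
    "do2hsc_warmup_epochs": 50,      # DOHSC training to set r_max and r_min for DO2HSC
    # Graph experiments (GIN backbone):
    "gin_layers": 4,
    "fc_layers": 3,
    "gin_hidden_dim": 16,
    "orthogonal_projection_dim": 8,
}
```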