Learning General Gaussian Mixture Model with Integral Cosine Similarity

Authors: Guanglin Li, Bin Li, Changsheng Chen, Shunquan Tan, Guoping Qiu

IJCAI 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results show that our method is more competitive in modeling data having correlations that may lead to singular covariance matrices in GMM, and it outperforms state-of-the-art methods in unsupervised anomaly detection." "In this section, the characteristics of the proposed method are demonstrated by experiments, where the Adam [Kingma and Ba, 2014] optimizer was used."
Researcher Affiliation | Academia | "Guangdong Key Laboratory of Intelligent Information Processing and Shenzhen Key Laboratory of Media Security, Shenzhen University, Shenzhen 518060, China; Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen 518129, China"
Pseudocode | No | The paper includes a network diagram (Figure 2) and mathematical formulations, but does not provide pseudocode or an algorithm block.
Open Source Code | Yes | "The implementation code is available at https://github.com/media-sec-lab/G2M2"
Open Datasets | Yes | "To evaluate the unsupervised anomaly detection performance, we conducted experiments on the datasets for outlier detection, including Lympho, Cardio, Annthyroid, Shuttle, Pima, Pendigits, Satimage2, Arrhythmia, Musk, Mnist, and Optdigits." The datasets are available at http://odds.cs.stonybrook.edu/
Dataset Splits | No | The paper specifies training and testing splits: "In each run, 60% of the data taken by random sampling were used for training, and the remaining data were used for testing." However, it does not specify a separate validation split.
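The 60/40 random split quoted above could be reproduced with a short helper like the following. This is an illustrative sketch, not the authors' code; the function name `random_split` and the use of NumPy's random generator are assumptions, and the paper does not state how the random sampling was seeded.

```python
import numpy as np

def random_split(x, y, train_frac=0.6, seed=0):
    """Randomly assign train_frac of the samples to training, the rest to testing.

    Mirrors the paper's described protocol: "60% of the data taken by
    random sampling were used for training, and the remaining data were
    used for testing." (seed handling is an assumption)
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))          # shuffle sample indices
    n_train = int(train_frac * len(x))     # 60% for training
    train_idx, test_idx = idx[:n_train], idx[n_train:]
    return (x[train_idx], y[train_idx]), (x[test_idx], y[test_idx])
```

Running the split once per trial (with a different seed each run) matches the "in each run" wording of the protocol.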
Hardware Specification | Yes | "In this study, G2M2 was implemented with Tensorflow (version 1.13.1) and run on a computer equipped with Intel Xeon E5-2640 CPU, 252 GB memory, and NVIDIA 1080Ti GPU (11 GB memory)."
Software Dependencies | Yes | "In this study, G2M2 was implemented with Tensorflow (version 1.13.1)"
Experiment Setup | Yes | "For characteristic evaluations, the learning rate was set to 0.1 and decayed by 0.9 every 20 epochs. For unsupervised anomaly detection, the learning rate was set to 0.01 and decayed by 0.1 every 20 epochs. The total number of training epochs was set to 40,000 for characteristic evaluations and 80,000 for unsupervised anomaly detection. The parameters in GMM or UGMF are randomly initialized."
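The learning-rate schedule quoted above reads like a standard step decay (multiply the base rate by the decay factor once every 20 epochs). A minimal sketch, assuming this interpretation (the paper's exact decay formula is not shown, and the function name `stepped_lr` is hypothetical):

```python
def stepped_lr(epoch, base_lr, decay, step=20):
    """Step decay: multiply base_lr by decay once every `step` epochs.

    Assumed reading of "the learning rate was set to 0.1 and decayed
    by 0.9 every 20 epochs" (and 0.01 decayed by 0.1 for anomaly
    detection); the paper does not give an explicit formula.
    """
    return base_lr * decay ** (epoch // step)

# Characteristic evaluations: base 0.1, decay 0.9
# Unsupervised anomaly detection: base 0.01, decay 0.1
```

Under this reading, the anomaly-detection rate drops by a factor of 10 every 20 epochs, so it becomes very small long before the 80,000-epoch budget is exhausted; the characteristic-evaluation rate decays far more gently.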