Mixture of GANs for Clustering

Authors: Yang Yu, Wen-Ji Zhou

IJCAI 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The experiments show that the proposed GANMM achieves good performance on complex data as well as simple data.
Researcher Affiliation | Academia | Yang Yu and Wen-Ji Zhou, National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China. {yuy,zhouwj}@lamda.nju.edu.cn
Pseudocode | Yes | Algorithm 1: GAN mixture model learning algorithm.
Open Source Code | Yes | An implementation of GANMM can be found at https://github.com/eyounx/GANMM.
Open Datasets | Yes | "On MNIST Dataset [LeCun et al., 1998]. It is a handwritten digit dataset containing 60,000 images of size 28 by 28 pixels, consisting of 10 classes from digit 0 to 9." and "We finally compare the clustering performance on two UCI datasets [Dua and Karra, 2017]."
Dataset Splits | No | The paper does not provide specific training, validation, and test dataset splits with percentages or counts, nor does it refer to standard predefined splits for reproducibility.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory, or processing units) used for running its experiments.
Software Dependencies | No | The paper mentions using Wasserstein GAN and DEC implementations from GitHub but does not provide specific version numbers for these or other software dependencies.
Experiment Setup | No | The paper describes network architectures (e.g., "two convolution layers and two dense layers") and parameters in Algorithm 1 (e.g., "learning rate", "number of epoch for GANs"), but does not provide specific numerical values for these hyperparameters or other system-level training settings.
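The Pseudocode row above refers to the paper's Algorithm 1, which is not reproduced on this page. Purely as an illustration of the EM-style structure a mixture-of-GANs learner follows (alternating hard assignment of points to components with per-component updates), here is a hypothetical sketch. A trivially fittable 1-D Gaussian stands in for each component GAN: moment matching replaces generator/discriminator training, and the Gaussian log-density replaces the discriminator's realness score. All names and details here are assumptions, not the authors' implementation.

```python
import numpy as np

def ganmm_skeleton(data, k, n_rounds=10, seed=0):
    """EM-style mixture learning skeleton (hypothetical stand-in, not GANMM itself).

    Each "component" abstracts one GAN: the M-step below (moment matching)
    plays the role of a GAN training epoch on the component's assigned points,
    and the E-step score (Gaussian log-density) plays the role of the
    discriminator's realness score used for reassignment.
    """
    rng = np.random.default_rng(seed)
    # Random initial hard assignment of points to the k components.
    assign = rng.integers(0, k, size=len(data))
    params = [(0.0, 1.0)] * k  # (mean, std) per component
    for _ in range(n_rounds):
        # "M-step": update each component on its currently assigned cluster.
        for j in range(k):
            pts = data[assign == j]
            if len(pts) > 0:
                params[j] = (pts.mean(), pts.std() + 1e-6)
        # "E-step": reassign each point to the component scoring it highest.
        scores = np.stack([
            -((data - m) ** 2) / (2 * s ** 2) - np.log(s) for m, s in params
        ])
        assign = scores.argmax(axis=0)
    return assign, params
```

On well-separated synthetic clusters this skeleton recovers the grouping after a few rounds; in the actual method each component update would be a full GAN training step rather than a closed-form fit.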