Learning Interpretable Representations with Informative Entanglements

Authors: Ege Beyazıt, Doruk Tuncel, Xu Yuan, Nian-Feng Tzeng, Xindong Wu

IJCAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Upon qualitatively and quantitatively evaluating the proposed method using both synthetic and real-world datasets, we show that our proposed regularizer guides GANs to learn representations with disentanglement scores competing with the state-of-the-art, while extracting a wider variety of salient features."
Researcher Affiliation | Collaboration | Ege Beyazit^1, Doruk Tuncel^2, Xu Yuan^1, Nian-Feng Tzeng^1 and Xindong Wu^3; 1: University of Louisiana at Lafayette, Lafayette, LA, USA; 2: Johannes Kepler University Linz, Linz, Austria; 3: Mininglamp Academy of Sciences, Beijing, China
Pseudocode | No | The paper describes the methodology with mathematical formulations and textual explanations but does not include structured pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide a repository link or an explicit statement about releasing source code for the described methodology.
Open Datasets | Yes | "MNIST [LeCun et al., 2010] consists of 70,000 28×28 grayscale images of handwritten digits... 3D faces dataset [Paysan et al., 2009] contains 240,000 face models... dSprites dataset [Matthey et al., 2017] consists of 737,280 64×64 images of sprites"
Dataset Splits | No | The paper mentions tuning parameters via grid search but does not give specific train/validation/test splits (e.g., percentages, sample counts, or references to standard splits) for its experiments.
Hardware Specification | No | The paper does not report specific hardware details such as GPU/CPU models, processor types, or memory amounts used for its experiments; it only describes the general experimental setup.
Software Dependencies | No | The paper discusses various models and frameworks (e.g., InfoGAN, GANs, Bayesian networks) but does not provide version numbers for any software components or libraries required to replicate the experiments.
Experiment Setup | Yes | "In our experiments, the proposed regularization is implemented on top of the same discriminator and generator architectures of InfoGAN, while tuning the parameters using grid search... To train a GAN using our proposed regularizer, we set G to the graph structure shown in Figure 2a for the continuous random variables, and set the regularization weight λ to 0.2... We set our method's generator and discriminator learning rates to 5e-4 and 2e-4, respectively. We set λ = 0.1 and G to the graph structure shown in Figure 2a... We set the dimension of the input noise vector z to 52 for our proposed method, then train both models for 100 epochs."
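The reported hyperparameters can be collected into a single configuration for anyone attempting a replication. A minimal sketch in Python, assuming the two λ values (0.2 and 0.1) correspond to different experiments as quoted above; the key names and the `validate` helper are hypothetical, since the paper does not name its framework or configuration format:

```python
# Hypothetical replication config assembled from the quoted setup.
# All key names are our own; only the values come from the paper.
config = {
    "architecture": "InfoGAN",    # same G/D architectures as InfoGAN
    "lr_generator": 5e-4,         # generator learning rate
    "lr_discriminator": 2e-4,     # discriminator learning rate
    "lambda_reg": [0.2, 0.1],     # regularization weights quoted for the two experiments
    "z_dim": 52,                  # dimension of the input noise vector z
    "epochs": 100,                # training epochs for both models
}

def validate(cfg):
    """Sanity-check the values against the setup quoted in the paper."""
    assert cfg["lr_generator"] > cfg["lr_discriminator"]
    assert all(0.0 < lam < 1.0 for lam in cfg["lambda_reg"])
    assert cfg["z_dim"] == 52 and cfg["epochs"] == 100
    return cfg

validate(config)
```

Note that details the paper leaves unstated (optimizer choice, batch size, grid-search ranges) are exactly the gaps flagged in the rows above.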