On Unifying Deep Generative Models
Authors: Zhiting Hu, Zichao Yang, Ruslan Salakhutdinov, Eric P. Xing
ICLR 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct preliminary experiments to demonstrate the generality and effectiveness of the importance weighting (IW) and adversarial activating (AA) techniques. In this paper we do not aim at achieving state-of-the-art performance, but leave it for future work. In particular, we show the IW and AA extensions improve the standard GANs and VAEs, as well as several of their variants, respectively. We present the results here, and provide details of experimental setups in the supplements. (A reference form of the IW bound is given after the table.) |
| Researcher Affiliation | Collaboration | Zhiting Hu (1,2), Zichao Yang (1), Ruslan Salakhutdinov (1), Eric P. Xing (1,2); 1: Carnegie Mellon University, 2: Petuum Inc. |
| Pseudocode | No | No pseudocode or clearly labeled algorithm block was found in the paper. |
| Open Source Code | No | The paper does not include an unambiguous statement or a direct link indicating the release of open-source code for the described methodology. |
| Open Datasets | Yes | We use MNIST, SVHN, and CIFAR10 for evaluation. (A dataset-loading sketch follows the table.) |
| Dataset Splits | No | The paper mentions 'test set' and varying 'Train Data Size' (1%, 10%, 100%) for MNIST, and 'cross-validation' for hyperparameter tuning, but it does not provide explicit percentages or sample counts for the training, validation, and test splits needed for reproduction. |
| Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'tensorflow library' but does not provide specific version numbers for software dependencies needed to replicate the experiment. |
| Experiment Setup | Yes | The base GAN model is implemented with the DCGAN architecture and hyperparameter setting (Radford et al., 2015). Hyperparameters are not tuned for the IW extensions. We select the best temperature from {1, 1.5, 3, 5} through cross-validation. (A temperature-selection sketch follows the table.) |
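For background on the importance weighting (IW) technique named in the Research Type row: the paper's IW extension is closely related to the importance weighted autoencoder (IWAE) bound of Burda et al. (2015). The standard k-sample form of that bound is reproduced here for reference only; it is background material, not the paper's exact objective.

```latex
% k-sample IWAE bound (Burda et al., 2015), shown for reference:
\mathcal{L}_k(x) \;=\;
\mathbb{E}_{z_1,\dots,z_k \sim q_\phi(z \mid x)}
\left[ \log \frac{1}{k} \sum_{i=1}^{k}
  \frac{p_\theta(x, z_i)}{q_\phi(z_i \mid x)} \right]
```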
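The Open Datasets row names MNIST, SVHN, and CIFAR10. Below is a minimal loading sketch using `tensorflow_datasets`; the paper only names the datasets, so the library choice and the split names are assumptions.

```python
# Minimal sketch: load the three evaluation datasets with
# tensorflow_datasets. The paper names the datasets but not the
# loading pipeline, so this library and these splits are assumptions.
import tensorflow_datasets as tfds

data = {
    name: tfds.load(name, split=["train", "test"], as_supervised=True)
    for name in ["mnist", "svhn_cropped", "cifar10"]  # tfds dataset names
}

mnist_train, mnist_test = data["mnist"]
```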
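The Experiment Setup row reports that the temperature is chosen from {1, 1.5, 3, 5} by cross-validation. The sketch below shows one plausible shape for that grid search; `train_fn` and `score_fn` are hypothetical hooks, since the paper does not specify the validation metric or split.

```python
def select_temperature(train_fn, score_fn, temperatures=(1.0, 1.5, 3.0, 5.0)):
    """Grid-search the temperature hyperparameter, as in the paper's setup.

    train_fn(tau)   -- trains a model at temperature tau (hypothetical hook)
    score_fn(model) -- returns a validation score, higher is better
                       (hypothetical hook; the paper's metric is unstated)
    """
    best_tau, best_score = None, float("-inf")
    for tau in temperatures:
        score = score_fn(train_fn(tau))  # train at tau, then validate
        if score > best_score:
            best_tau, best_score = tau, score
    return best_tau
```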