Decoupling Global and Local Representations via Invertible Generative Flows
Authors: Xuezhe Ma, Xiang Kong, Shanghang Zhang, Eduard H Hovy
ICLR 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on standard image benchmarks demonstrate the effectiveness of our model in terms of density estimation, image generation and unsupervised representation learning. |
| Researcher Affiliation | Academia | 1University of Southern California 2Carnegie Mellon University 3University of California, Berkeley |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code for our model is available at https://github.com/XuezheMax/wolf. |
| Open Datasets | Yes | To evaluate our generative model, we conduct two groups of experiments on four benchmark datasets that are commonly used to evaluate deep generative models: CIFAR-10 (Krizhevsky & Hinton, 2009), the 64×64 downsampled version of ImageNet (Oord et al., 2016), the bedroom category in LSUN (Yu et al., 2015) and the CelebA-HQ dataset (Karras et al., 2018). |
| Dataset Splits | No | The paper mentions using standard benchmark datasets (CIFAR-10, ImageNet, LSUN, CelebA-HQ) but does not explicitly provide their specific training, validation, or test dataset splits (e.g., percentages or sample counts) within the main text or appendices. |
| Hardware Specification | Yes | Table 6 provides the number of parameters of different models on CIFAR-10, together with the corresponding training time over one epoch (measured on four Tesla V100 GPUs). |
| Software Dependencies | No | The paper mentions the use of the Adam optimizer and various architectures but does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | Table 5 (hyper-parameters in our experiments): CIFAR-10 (32×32): batch size 512, latent dim dz 64, weight decay 1e-6, 50 warmup updates; ImageNet (64×64): batch size 256, latent dim dz 128, weight decay 5e-4, 200 warmup updates; LSUN (128×128): batch size 256, latent dim dz 256, weight decay 5e-4, 200 warmup updates; CelebA-HQ (256×256): batch size 40, latent dim dz 256, weight decay 5e-4, 200 warmup updates. |
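The Table 5 hyper-parameters above can be collected into a small lookup for anyone re-running the experiments. This is a minimal sketch: the key names (`batch_size`, `latent_dim`, `weight_decay`, `warmup_updates`) and the `get_config` helper are illustrative conventions, not identifiers from the authors' wolf codebase.

```python
# Hyper-parameters as reported in Table 5 of the paper.
# Note: the dict key names are our own convention, not the wolf repo's.
HYPERPARAMS = {
    "CIFAR-10 (32x32)":    {"batch_size": 512, "latent_dim": 64,  "weight_decay": 1e-6, "warmup_updates": 50},
    "ImageNet (64x64)":    {"batch_size": 256, "latent_dim": 128, "weight_decay": 5e-4, "warmup_updates": 200},
    "LSUN (128x128)":      {"batch_size": 256, "latent_dim": 256, "weight_decay": 5e-4, "warmup_updates": 200},
    "CelebA-HQ (256x256)": {"batch_size": 40,  "latent_dim": 256, "weight_decay": 5e-4, "warmup_updates": 200},
}

def get_config(dataset: str) -> dict:
    """Return the reported hyper-parameters for a dataset key above."""
    return HYPERPARAMS[dataset]
```

For example, `get_config("CIFAR-10 (32x32)")["batch_size"]` yields 512, matching the table.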