VFlow: More Expressive Generative Flows with Variational Data Augmentation
Authors: Jianfei Chen, Cheng Lu, Biqi Chenli, Jun Zhu, Tian Tian
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | For image density modeling on the CIFAR-10 dataset, VFlow achieves a new state-of-the-art 2.98 bits per dimension. We first evaluate VFlow on a toy D_X = 2 Checkerboard dataset... We study the impact of the dimensionality of the flow D_X + D_Z ∈ {2, 4, 6, 8, 10}... We evaluate VFlow on CIFAR-10 and ImageNet for density estimation of images. |
| Researcher Affiliation | Collaboration | ¹Department of Computer Science and Technology, Institute for AI, BNRist Center, Tsinghua University; ²RealAI. |
| Pseudocode | No | The paper does not contain any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is open-sourced at https://github.com/thu-ml/vflow. |
| Open Datasets | Yes | For image density modeling on the CIFAR-10 dataset... We evaluate VFlow on CIFAR-10 and ImageNet (Russakovsky et al., 2015) for density estimation of images. |
| Dataset Splits | Yes | For this set of experiments, we randomly hold out 10,000 samples from the training set for validation. |
| Hardware Specification | Yes | All the experiments are run on 16 RTX 2080Ti GPUs. |
| Software Dependencies | No | The paper mentions using an "Adam optimizer (Kingma & Ba, 2015)" but does not provide specific version numbers for software libraries, frameworks, or programming languages used (e.g., Python, PyTorch, TensorFlow, CUDA versions). |
| Experiment Setup | Yes | The model is trained with an Adam optimizer (Kingma & Ba, 2015) with a batch size of 64 for 2,000 epochs. Following Ho et al. (2019), the learning rate linearly warms up to 0.0012 during the first 2,000 training steps, and exponentially decays at a rate of 0.99999 per step starting from the 50,000-th step until it reaches 0.0003. |
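
For the dataset-splits row, a minimal sketch of how the quoted 10,000-sample validation hold-out could be reproduced in PyTorch. The seed and the `random_split` wiring are assumptions for illustration only; the paper does not specify them.

```python
import torch
from torchvision import datasets, transforms

# Hypothetical reconstruction of the quoted split: hold out 10,000 of the
# 50,000 CIFAR-10 training images for validation. The seed and API choices
# are illustrative assumptions, not taken from the paper or its code.
train_full = datasets.CIFAR10(root="data", train=True, download=True,
                              transform=transforms.ToTensor())
generator = torch.Generator().manual_seed(0)  # assumed seed
train_set, val_set = torch.utils.data.random_split(
    train_full, [40_000, 10_000], generator=generator)
```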
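
For the experiment-setup row, a minimal sketch of the quoted learning-rate schedule: linear warmup to 0.0012 over the first 2,000 steps, then 0.99999-per-step exponential decay from step 50,000 down to a floor of 0.0003. The constants come directly from the paper's description; holding the rate constant between warmup and step 50,000, and the `LambdaLR` wiring with a placeholder model, are assumptions, and the released code at https://github.com/thu-ml/vflow may implement this differently.

```python
import torch

# Constants quoted from the paper; the hold-at-peak phase between warmup
# and step 50,000 is an assumption implied by the description.
PEAK_LR = 0.0012      # rate reached at the end of linear warmup
FLOOR_LR = 0.0003     # decay stops once the rate falls to this value
WARMUP_STEPS = 2_000  # linear warmup duration
DECAY_START = 50_000  # exponential decay begins at this step
DECAY_RATE = 0.99999  # per-step multiplicative decay factor

def lr_at_step(step: int) -> float:
    """Learning rate at a given training step under the quoted schedule."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS           # linear warmup
    if step < DECAY_START:
        return PEAK_LR                                 # hold at peak (assumed)
    decayed = PEAK_LR * DECAY_RATE ** (step - DECAY_START)
    return max(decayed, FLOOR_LR)                      # clamp at the floor

# Illustrative wiring: attach the schedule to Adam via LambdaLR. With a base
# rate of 1.0, the scheduler's multiplier equals the effective learning rate.
model = torch.nn.Linear(8, 8)  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_at_step)
```

Calling `scheduler.step()` once per training step then yields an effective learning rate of `lr_at_step(step)`, since the base rate is set to 1.0.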