Generative Flows with Matrix Exponential

Authors: Changyi Xiao, Ligang Liu

ICML 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Our experiments show that our model achieves great performance on density estimation amongst generative flows models. We evaluate our MEF model on CIFAR10 (Krizhevsky et al., 2009), ImageNet32 and ImageNet64 (Van Oord et al., 2016) datasets and compare log-likelihood with other generative flows models."
Researcher Affiliation | Academia | Changyi Xiao, Ligang Liu, University of Science and Technology of China. Correspondence to: Ligang Liu <lgliu@ustc.edu.cn>
Pseudocode | Yes | "Algorithm 1 Algorithm for computing matrix exponential" (a sketch of this computation appears after the table).
Open Source Code | Yes | "The code for our model is available at https://github.com/changyi7231/MEF."
Open Datasets | Yes | "We evaluate our MEF model on CIFAR10 (Krizhevsky et al., 2009), ImageNet32 and ImageNet64 (Van Oord et al., 2016) datasets"
Dataset Splits | No | The paper mentions training and test sets but does not explicitly describe a separate validation set or its split methodology.
Hardware Specification | Yes | "All models are trained on one TITAN Xp GPU. Our CIFAR10 model takes 1.67 seconds to generate a batch of 64 samples on one NVIDIA 1080 Ti GPU."
Software Dependencies | No | The paper mentions optimization methods like Adamax and activation functions like ELU, but does not provide specific version numbers for software dependencies such as Python, PyTorch/TensorFlow, or CUDA.
Experiment Setup | Yes | "We use a level L = 3 and depth D1 = 8, D2 = 4, D3 = 2. Each coupling layer is composed of 8 residual blocks... All models are trained for 50 epochs with batch size 64. We run models on CIFAR10 dataset with learning rate 0.01 and 0.001." (a configuration sketch collecting these values appears after the table).
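
The pseudocode row above refers to Algorithm 1, which computes the matrix exponential by a truncated power series combined with scaling and squaring. The sketch below illustrates that general scheme; it assumes PyTorch (the paper does not name its framework), and the truncation order and scaling rule are illustrative choices, not necessarily the paper's exact settings.

    # Minimal sketch (assuming PyTorch): matrix exponential via a truncated
    # Taylor series with scaling and squaring, the scheme Algorithm 1 describes.
    # The truncation order and scaling rule below are illustrative assumptions.
    import torch

    def matrix_exponential(M: torch.Tensor, num_terms: int = 10) -> torch.Tensor:
        """Approximate exp(M) for a square matrix M."""
        n = M.shape[-1]
        # Scale M down so the series converges quickly: exp(M) = exp(M / 2^s)^(2^s).
        norm = torch.linalg.matrix_norm(M, ord=1)
        s = max(0, int(torch.ceil(torch.log2(norm + 1e-12)).item()))
        A = M / (2 ** s)

        # Truncated Taylor series for exp(A).
        result = torch.eye(n, dtype=M.dtype, device=M.device)
        term = torch.eye(n, dtype=M.dtype, device=M.device)
        for k in range(1, num_terms + 1):
            term = term @ A / k
            result = result + term

        # Undo the scaling by repeated squaring.
        for _ in range(s):
            result = result @ result
        return result

For a small weight matrix such as M = 0.1 * torch.randn(3, 3), matrix_exponential(M) closely matches the reference torch.matrix_exp(M).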
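
The experiment-setup row quotes the key hyperparameters. A minimal configuration sketch collecting them is shown below; the field names and the hypothetical mef_model are assumptions used only to show how the reported values would fit together, and only the numeric values come from the quoted text.

    # Minimal sketch collecting the reported hyperparameters (assuming PyTorch).
    config = {
        "levels": 3,                        # level L = 3
        "depths": (8, 4, 2),                # depth D1 = 8, D2 = 4, D3 = 2
        "residual_blocks_per_coupling": 8,  # 8 residual blocks per coupling layer
        "epochs": 50,
        "batch_size": 64,
        "learning_rates": (0.01, 0.001),    # two CIFAR10 runs are reported
    }

    # The paper mentions Adamax as the optimizer; for a hypothetical model `mef_model`:
    # optimizer = torch.optim.Adamax(mef_model.parameters(), lr=config["learning_rates"][0])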