SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

Authors: Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the efficacy of SurVAE Flows with experiments on synthetic datasets, point cloud data, and images. Code to implement SurVAE Flows and reproduce results is publicly available. ... 4 Experiments ... We investigate the ability of SurVAE flows to model data that is difficult to model with normalizing flows. ... Table 4: Unconditional image modeling results in bits/dim. ... Table 5: Inception score and FID for CIFAR-10.
Researcher Affiliation | Academia | Technical University of Denmark; UvA-Bosch Delta Lab, University of Amsterdam
Pseudocode | Yes | Algorithm 1: log_likelihood(x) (a sketch of this computation follows the table)
Open Source Code | Yes | The code is available at https://github.com/didriknielsen/survae_flows
Open Datasets | Yes | We demonstrate the efficacy of SurVAE Flows with experiments on synthetic datasets, point cloud data, and images. ... We use the Spatial MNIST dataset (Edwards and Storkey, 2017)... We train a flow using 2 scales with 12 steps/scale for CIFAR-10 and ImageNet 32×32 and 3 scales with 8 steps/scale for ImageNet 64×64.
Dataset Splits | No | No specific training/validation/test dataset splits (e.g., percentages, sample counts, or explicit mention of standard splits) are provided in the paper for the datasets used.
Hardware Specification | Yes | This research was supported by the NVIDIA Corporation with the donation of TITAN X GPUs.
Software Dependencies | No | No specific software dependencies with version numbers (e.g., PyTorch 1.9, Python 3.8) are explicitly mentioned in the paper's text.
Experiment Setup | No | No specific experimental setup details such as hyperparameter values (e.g., learning rate, batch size) are provided in the main text; such details are deferred to the appendices.
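
The Pseudocode row refers to Algorithm 1 of the paper, which computes the log-likelihood by composing per-layer likelihood contributions: bijective layers contribute the log-Jacobian determinant, while surjective layers contribute terms such as the conditional log-density of dimensions dropped by the transform. The following is a minimal sketch of that recursion, not the authors' implementation; the class names and interfaces (AffineBijection, SliceSurjection, SurVAEFlow) are hypothetical and are not taken from the survae_flows repository.

```python
# Sketch of a SurVAE-style compositional log-likelihood (hypothetical interface).
# Each layer maps x -> z and returns its likelihood contribution.

import torch
import torch.nn as nn
from torch.distributions import Normal


class AffineBijection(nn.Module):
    """Elementwise affine bijection: z = x * exp(log_scale) + shift."""

    def __init__(self, dim):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = x * self.log_scale.exp() + self.shift
        # Bijective contribution: log |det dz/dx|, identical for every sample.
        ldj = self.log_scale.sum() * x.new_ones(x.shape[0])
        return z, ldj


class SliceSurjection(nn.Module):
    """Slicing surjection: keep the first `keep_dim` dimensions, drop the rest.

    Generatively, the dropped dimensions are modelled by a conditional Gaussian;
    its log-density at the observed values is the layer's likelihood contribution.
    """

    def __init__(self, keep_dim, drop_dim):
        super().__init__()
        self.keep_dim = keep_dim
        self.decoder = nn.Linear(keep_dim, 2 * drop_dim)  # predicts mean and log_std

    def forward(self, x):
        z, dropped = x[:, : self.keep_dim], x[:, self.keep_dim :]
        mean, log_std = self.decoder(z).chunk(2, dim=-1)
        contribution = Normal(mean, log_std.exp()).log_prob(dropped).sum(dim=-1)
        return z, contribution


class SurVAEFlow(nn.Module):
    """Stack of layers, each returning (z, likelihood_contribution)."""

    def __init__(self, layers, base_dim):
        super().__init__()
        self.layers = nn.ModuleList(layers)
        self.base = Normal(torch.zeros(base_dim), torch.ones(base_dim))

    def log_likelihood(self, x):
        ll = x.new_zeros(x.shape[0])
        for layer in self.layers:
            x, contribution = layer(x)
            ll = ll + contribution
        # Finish with the log-density of the base distribution p(z).
        return ll + self.base.log_prob(x).sum(dim=-1)


if __name__ == "__main__":
    # Toy usage: 4-D data reduced to a 2-D latent via a slicing surjection.
    flow = SurVAEFlow(
        [AffineBijection(4), SliceSurjection(keep_dim=2, drop_dim=2), AffineBijection(2)],
        base_dim=2,
    )
    x = torch.randn(16, 4)
    print(flow.log_likelihood(x).shape)  # torch.Size([16])
```

The design point illustrated here is the one the paper emphasizes: bijections, stochastic VAE-like layers, and surjections all expose the same interface (a transformed variable plus a likelihood contribution), so they can be freely composed in a single model.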