Neural Spline Flows

Authors: Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate that neural spline flows improve density estimation, variational inference, and generative modeling of images. In our experiments, the neural network NN which computes the parameters of the elementwise transformations is a residual network [18] with pre-activation residual blocks [19].
Researcher Affiliation | Academia | Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios, School of Informatics, University of Edinburgh, {conor.durkan, artur.bekasov, i.murray, g.papamakarios}@ed.ac.uk
Pseudocode | No | The paper describes the practical implementation steps of the monotonic rational-quadratic coupling transform using numbered points in Section 3.1, but it does not present this information in a formally labeled 'Pseudocode' or 'Algorithm' block. (A hedged sketch of this transform follows the table.)
Open Source Code | Yes | Code is available online at https://github.com/bayesiains/nsf.
Open Datasets | Yes | We first evaluate our proposed flows using a selection of datasets from the UCI machine-learning repository [7] and the BSDS300 collection of natural images [38]. For our experiments, we use dynamically binarized versions of the MNIST dataset of handwritten digits [33], and the EMNIST dataset variant featuring handwritten letters [5]. We use the CIFAR-10 [31] and downsampled 64x64 ImageNet [49, 60] datasets.
Dataset Splits | Yes | For validation results which can be used for comparison during model development, see table 6 in appendix B.1.
Hardware Specification | Yes | Each GPU experiment was run on a single NVIDIA GeForce GTX 1080 Ti with 11GB of RAM.
Software Dependencies | Yes | All experiments are run using PyTorch 1.1 with CUDA 10.0 and cuDNN 7.4.1.
Experiment Setup | Yes | Unless otherwise stated, we use a constant learning rate of 10^-3, batch size 200, 10 flow steps, K = 8 bins for the splines and a tail bound B = 3. We train each model for 500 epochs for UCI datasets, and 200 epochs for MNIST and EMNIST datasets. (These defaults are collected in a configuration sketch after the table.)
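
The monotonic rational-quadratic coupling transform noted in the Pseudocode row is described in Section 3.1 of the paper, but never as a formal algorithm. As a rough illustration only, here is a minimal NumPy sketch of the forward pass, assuming the conditioner network has already produced the knot coordinates and derivatives. The function and variable names are hypothetical; this is not the authors' implementation (their code is at https://github.com/bayesiains/nsf).

import numpy as np

def rq_spline_forward(x, xs, ys, ds, tail_bound=3.0):
    """Map scalar x through a monotonic rational-quadratic spline.

    xs, ys : arrays of K+1 knot coordinates spanning [-B, B].
    ds     : array of K+1 positive derivatives at the knots.
    Outside [-B, B] the transform is the identity (linear tails).
    Returns the transformed value and the log-derivative needed for
    the change-of-variables term.
    """
    if x <= -tail_bound or x >= tail_bound:
        return x, 0.0  # identity tails contribute zero to log|det|

    k = np.searchsorted(xs, x) - 1            # bin containing x
    width = xs[k + 1] - xs[k]
    height = ys[k + 1] - ys[k]
    s = height / width                        # bin slope
    xi = (x - xs[k]) / width                  # position in the bin, in [0, 1)

    denom = s + (ds[k + 1] + ds[k] - 2 * s) * xi * (1 - xi)
    y = ys[k] + height * (s * xi**2 + ds[k] * xi * (1 - xi)) / denom

    # Derivative of the rational-quadratic segment at x.
    deriv = s**2 * (ds[k + 1] * xi**2 + 2 * s * xi * (1 - xi)
                    + ds[k] * (1 - xi)**2) / denom**2
    return y, np.log(deriv)

For example, with hypothetical knots xs = [-3, 0, 3], ys = [-3, 1, 3] and derivatives ds = [1, 0.5, 1], calling rq_spline_forward(1.5, xs, ys, ds) evaluates the spline in the second bin; monotonicity holds whenever the knot heights are increasing and the knot derivatives are positive.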
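
Similarly, the defaults quoted in the Experiment Setup row can be gathered in one place. The configuration object below is a hypothetical sketch; the field names are illustrative and not drawn from the authors' code.

from dataclasses import dataclass

@dataclass
class NSFDefaults:
    learning_rate: float = 1e-3   # constant learning rate of 10^-3
    batch_size: int = 200
    num_flow_steps: int = 10
    num_bins: int = 8             # K, rational-quadratic spline bins
    tail_bound: float = 3.0       # B, identity tails outside [-B, B]
    uci_epochs: int = 500         # training epochs for UCI datasets
    image_epochs: int = 200       # training epochs for MNIST and EMNIST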