Flows for simultaneous manifold learning and density estimation

Authors: Johann Brehmer, Kyle Cranmer

NeurIPS 2020

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "In a range of experiments we demonstrate how M-flows learn the data manifold and allow for better inference than standard flows in the ambient data space." |
| Researcher Affiliation | Academia | Johann Brehmer and Kyle Cranmer, New York University (johann.brehmer@nyu.edu, kyle.cranmer@nyu.edu) |
| Pseudocode | No | No structured pseudocode or algorithm blocks are present in the paper. |
| Open Source Code | Yes | "The code used in our study is available at http://github.com/johannbrehmer/manifold-flow." |
| Open Datasets | Yes | "We generate these with a StyleGAN2 [25] model trained on the FFHQ dataset [26], sampling n of the GAN latent variables while keeping all others fixed. ... In addition, we use the real-world CelebA-HQ dataset [26]." |
| Dataset Splits | No | The paper mentions training and test data but does not explicitly describe a separate validation split or how one was used beyond general evaluation. |
| Hardware Specification | No | Funding disclosure: "This work was supported in part through the NYU IT High Performance Computing resources, services, and staff expertise." (A general statement; no specific hardware such as GPU/CPU models is given.) |
| Software Dependencies | Yes | "We are grateful to the authors and maintainers of DELPHES 3 [32], ... PYTHIA 8 [39], ... PyTorch [40], pytorch-fid [41], scikit-learn [42], and SciPy [43]." |
| Experiment Setup | Yes | "All models are based on rational-quadratic neural spline flows [17]. For tabular datasets, we construct transformations f and h by alternating coupling layers with either random permutations or invertible linear transformations, using between 20 and 35 coupling layers depending on the dataset. For image data, f is based on a multi-scale architecture [5] with between 20 and 28 coupling layers across four levels interspersed with actnorm layers and 1×1 convolutions, closely following Refs. [17, 18]." |
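The setup quoted above (coupling layers alternated with random permutations or invertible linear maps) corresponds to a standard normalizing-flow construction. Below is a minimal PyTorch sketch, not the authors' implementation: it substitutes simple affine couplings for the paper's rational-quadratic spline couplings, and every class name, hidden size, and layer count here is illustrative.

```python
# Minimal sketch (not the paper's code): a flow built from alternating
# coupling layers and random permutations. Affine couplings stand in for
# the rational-quadratic spline couplings used in the paper.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Transforms the second half of the features conditioned on the first."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep scales well-behaved
        y2 = x2 * torch.exp(s) + t             # elementwise affine transform
        log_det = s.sum(dim=1)                 # log|det Jacobian| of the coupling
        return torch.cat([x1, y2], dim=1), log_det

class RandomPermutation(nn.Module):
    """Fixed random permutation of features; volume-preserving (log-det 0)."""
    def __init__(self, dim):
        super().__init__()
        self.register_buffer("perm", torch.randperm(dim))

    def forward(self, x):
        return x[:, self.perm], torch.zeros(x.shape[0], device=x.device)

class Flow(nn.Module):
    """Stack of (coupling, permutation) pairs with accumulated log-det."""
    def __init__(self, dim, n_layers=20):
        super().__init__()
        layers = []
        for _ in range(n_layers):
            layers += [AffineCoupling(dim), RandomPermutation(dim)]
        self.layers = nn.ModuleList(layers)

    def forward(self, x):
        log_det = torch.zeros(x.shape[0], device=x.device)
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return x, log_det

flow = Flow(dim=8, n_layers=20)                # 20 couplings, per the tabular setup
z, log_det = flow(torch.randn(16, 8))
print(z.shape, log_det.shape)                  # torch.Size([16, 8]) torch.Size([16])
```

Interleaving a permutation (or invertible linear map) between couplings ensures every coordinate is eventually transformed, since a single coupling layer leaves its conditioning half unchanged; this is why the quoted setup alternates the two.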