Exchangeable Generative Models with Flow Scans

Authors: Christopher Bender, Kevin O'Connor, Yang Li, Juan Garcia, Junier Oliva, Manzil Zaheer (pp. 10053–10060)

AAAI 2020

Reproducibility variables, each with its verdict and the supporting LLM response:
Research Type: Experimental
  LLM Response: "In this work, we develop a new approach to generative density estimation for exchangeable, non-i.i.d. data. ... We achieve new state-of-the-art performance on point cloud and image set modeling. ... In this section, we compare the performance of Flow Scan to that of BRUNO and NS in a variety of exchangeable point cloud and image modeling tasks."
Researcher Affiliation: Collaboration
  LLM Response: "Christopher M. Bender,1 Kevin O'Connor,2 Yang Li,1 Juan Jose Garcia,1 Junier Oliva,1 Manzil Zaheer3 — 1Department of Computer Science, UNC Chapel Hill; 2Department of Statistics and Operations Research, UNC Chapel Hill; 3Google Research"
Pseudocode: No
  LLM Response: The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code: Yes
  LLM Response: "Further implementation details (including code and Appendices) can be found at https://github.com/lupalab/flowscan."
Open Datasets: Yes
  LLM Response: "We consider object classes from the ModelNet dataset (Wu et al. 2015)... Each set consists of 50 points sampled uniformly at random from active pixels of a single MNIST (LeCun et al. 1998) image with uniform noise added to ensure nondegeneracy."
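The MNIST point-cloud construction quoted above is concrete enough to sketch. The snippet below is a minimal reconstruction from that one sentence, not the authors' code: the `image_to_point_set` helper name is ours, the noise scale of one pixel is an assumption, and a random binary mask stands in for an actual MNIST digit.

```python
import numpy as np

def image_to_point_set(img, n_points=50, noise_scale=1.0, seed=None):
    """Sample a point set from the active (nonzero) pixels of a binary image.

    Mirrors the quoted construction: n_points drawn uniformly at random
    from active pixels, with uniform noise added for nondegeneracy
    (so no two sampled points coincide exactly on the pixel grid).
    """
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(img)                       # coordinates of active pixels
    idx = rng.choice(len(xs), size=n_points, replace=True)
    points = np.stack([xs[idx], ys[idx]], axis=1).astype(np.float64)
    points += rng.uniform(0.0, noise_scale, size=points.shape)  # dequantize
    return points

# Stand-in for a single MNIST digit: a random 28x28 binary mask.
img = (np.random.default_rng(0).random((28, 28)) > 0.8).astype(np.uint8)
cloud = image_to_point_set(img, n_points=50, seed=0)
print(cloud.shape)  # (50, 2)
```

Sampling with replacement is used here so the sketch also works for images with fewer than 50 active pixels; whether the paper samples with or without replacement is not stated.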
Dataset Splits: No
  LLM Response: The paper mentions using a "held out test set" but does not provide specific details on the train, validation, and test splits (e.g., percentages, sample counts, or explicit splitting methodology).
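Because the splits are undocumented, a reproducer has to choose their own partitioning. A minimal sketch of a seeded set-level split follows; the 80/20 ratio and the `split_sets` helper are our assumptions, not the authors' protocol.

```python
import random

def split_sets(sets, train_frac=0.8, seed=0):
    """Partition a collection of point-cloud sets into train/test lists.

    Splitting at the set level (not the point level) keeps each
    exchangeable set intact; the ratio and seed are arbitrary choices.
    """
    sets = list(sets)
    random.Random(seed).shuffle(sets)  # deterministic shuffle for reproducibility
    cut = int(train_frac * len(sets))
    return sets[:cut], sets[cut:]

train, test = split_sets(range(100))
print(len(train), len(test))  # 80 20
```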
Hardware Specification: No
  LLM Response: The paper does not describe the hardware (e.g., GPU/CPU models or memory) used to run its experiments.
Software Dependencies: No
  LLM Response: The paper does not provide version numbers for any of the software dependencies or libraries used.
Experiment Setup: No
  LLM Response: The paper describes the model architecture and training objectives but does not report hyperparameters (e.g., learning rate, batch size, number of epochs) or other fine-grained experimental settings.