Learning Latent Space Energy-Based Prior Model

Authors: Bo Pang, Tian Han, Erik Nijkamp, Song-Chun Zhu, Ying Nian Wu

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We test the proposed modeling, learning and computing method on tasks such as image synthesis, text generation, as well as anomaly detection. We show that our method is competitive with prior art."
Researcher Affiliation | Academia | "Bo Pang¹, Tian Han², Erik Nijkamp¹, Song-Chun Zhu¹, Ying Nian Wu¹; ¹University of California, Los Angeles; ²Stevens Institute of Technology"
Pseudocode | Yes | "Algorithm 1: Learning latent space EBM prior via short-run MCMC." (A minimal sketch of this learning step is given after the table.)
Open Source Code | Yes | "The one-page code can be found in supplementary materials." and "Code to reproduce the reported results is available at https://bpucla.github.io/latent-space-ebm-prior-project/"
Open Datasets | Yes | "For image data, we include SVHN [48], CelebA [42], and CIFAR-10 [36]. For text data, we include PTB [46], Yahoo [78], and SNLI [5]."
Dataset Splits | No | The paper uses standard datasets but does not explicitly provide percentages, counts, or a methodology for splitting the data into training, validation, and test sets. It mentions 'test images' without specifying their proportion or how they were derived from the full datasets.
Hardware Specification | Yes | "We thank the NVIDIA Corporation for the donation of 2 Titan V GPUs."
Software Dependencies | No | The paper does not specify versions for the software libraries, frameworks, or programming languages used in the experiments.
Experiment Setup | Yes | "Algorithm 1 input: learning iterations T, learning rate for prior model η0, learning rate for generation model η1, initial parameters θ0 = (α0, β0), observed examples {x_i}_{i=1}^n, batch size m, number of prior and posterior sampling steps {K0, K1}, and prior and posterior sampling step sizes {s0, s1}." (A toy instantiation of these inputs is given after the table.)
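
To make Algorithm 1 concrete, here is a minimal PyTorch sketch of one learning iteration: short-run Langevin sampling from the latent EBM prior and from the posterior, followed by the prior-model and generator updates. The network names f_alpha and g_beta and the noise level sigma are illustrative assumptions, not the authors' released code.

```python
import torch

def langevin_sample(z, log_p, steps, step_size):
    # Short-run Langevin dynamics:
    #   z_{k+1} = z_k + (s^2 / 2) * grad_z log p(z_k) + s * eps_k,  eps_k ~ N(0, I)
    for _ in range(steps):
        z = z.detach().requires_grad_(True)
        grad = torch.autograd.grad(log_p(z).sum(), z)[0]
        z = z + 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(z)
    return z.detach()

def train_step(f_alpha, g_beta, x, opt_alpha, opt_beta,
               z_dim, K0, K1, s0, s1, sigma=0.3):
    # One iteration of the learning loop, assuming the prior
    #   p_alpha(z) ∝ exp(f_alpha(z)) N(z; 0, I)
    # and the generator p_beta(x | z) = N(g_beta(z), sigma^2 I).
    # sigma is a placeholder value, not the paper's setting.
    energy = lambda z: f_alpha(z).squeeze(-1)               # scalar f_alpha(z) per example
    log_prior = lambda z: energy(z) - 0.5 * z.pow(2).flatten(1).sum(1)
    log_joint = lambda z: log_prior(z) - \
        ((x - g_beta(z)) ** 2).flatten(1).sum(1) / (2 * sigma ** 2)

    z0 = torch.randn(x.size(0), z_dim)
    z_minus = langevin_sample(z0, log_prior, K0, s0)        # K0 steps: prior sample
    z_plus = langevin_sample(z0.clone(), log_joint, K1, s1) # K1 steps: posterior sample

    # Prior update: the log-likelihood gradient w.r.t. alpha is
    # E_posterior[grad f_alpha(z)] - E_prior[grad f_alpha(z)],
    # so descend on f(z-) - f(z+).
    loss_alpha = energy(z_minus).mean() - energy(z_plus).mean()
    opt_alpha.zero_grad(); loss_alpha.backward(); opt_alpha.step()

    # Generator update: reconstruct x from the posterior sample.
    loss_beta = ((x - g_beta(z_plus)) ** 2).flatten(1).sum(1).mean() / (2 * sigma ** 2)
    opt_beta.zero_grad(); loss_beta.backward(); opt_beta.step()
```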
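And a toy instantiation showing how Algorithm 1's inputs (T, η0, η1, m, K0, K1, s0, s1) map onto the sketch above. The architectures, learning rates, sampling steps, and step sizes below are hypothetical placeholders, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

z_dim, m, T = 16, 100, 200                   # latent dim, batch size m, iterations T
f_alpha = nn.Sequential(nn.Linear(z_dim, 200), nn.LeakyReLU(0.2), nn.Linear(200, 1))
g_beta = nn.Sequential(nn.Linear(z_dim, 784), nn.Tanh())

opt_alpha = torch.optim.Adam(f_alpha.parameters(), lr=1e-4)  # eta_0: prior model lr
opt_beta = torch.optim.Adam(g_beta.parameters(), lr=1e-4)    # eta_1: generator lr

x = torch.rand(m, 784) * 2 - 1               # stand-in for observed examples {x_i}
for t in range(T):
    train_step(f_alpha, g_beta, x, opt_alpha, opt_beta,
               z_dim=z_dim, K0=40, K1=20, s0=0.4, s1=0.1)    # placeholder values
```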