Learning Energy-based Model via Dual-MCMC Teaching

Authors: Jiali Cui, Tian Han

NeurIPS 2023

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "In this section, we address the following questions: (1) Can our method learn an EBM with high-quality synthesis? (2) Can both the complementary generator and inference model successfully match their MCMC-revised samples? and (3) What is the influence of the inference model and generator model? We refer to implementation details and additional experiments in Appendix." (Section 5.1, Image Modelling: "We first evaluate the EBM in image data modelling.") |
| Researcher Affiliation | Academia | "Jiali Cui, Tian Han. Department of Computer Science, Stevens Institute of Technology. {jcui7,than6}@stevens.edu" |
| Pseudocode | Yes | "We present the learning algorithm in Appendix." |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code availability. |
| Open Datasets | Yes | "We benchmark our method on standard datasets such as CIFAR-10 [23] and CelebA-64 [25], as well as challenging high-resolution CelebA-HQ-256 [19] and large-scale LSUN-Church-64 [44]." |
| Dataset Splits | No | The paper mentions using standard datasets such as CIFAR-10 and CelebA-64 but does not explicitly state the train/validation/test splits used for the experiments. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments. |
| Software Dependencies | No | The paper does not specify software dependencies with version numbers. |
| Experiment Setup | Yes | "We train our model on CelebA-64 using Langevin steps kx = 30 for the MCMC revision on x and kz = 10 for the MCMC revision on z." |
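To make the Experiment Setup entry concrete, the quoted hyperparameters (kx = 30 Langevin steps revising samples x, kz = 10 steps revising latents z) refer to standard Langevin-dynamics updates. Below is a minimal sketch of such a revision loop; the quadratic energy function, step size, and array shapes are illustrative assumptions, not the paper's actual model or implementation.

```python
import numpy as np

def langevin_revise(x0, grad_energy, n_steps, step_size, rng):
    """Revise samples x0 with n_steps of Langevin dynamics:
    x <- x - (step_size^2 / 2) * grad E(x) + step_size * noise.
    grad_energy returns the gradient of an (illustrative) energy E."""
    x = x0.copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - 0.5 * step_size**2 * grad_energy(x) + step_size * noise
    return x

# Illustrative quadratic energy E(x) = ||x||^2 / 2, so grad E(x) = x.
grad_E = lambda x: x

rng = np.random.default_rng(0)
x_init = rng.standard_normal((4, 8))   # stand-in for generator samples x
z_init = rng.standard_normal((4, 2))   # stand-in for inferred latents z

# kx = 30 revision steps on x, kz = 10 revision steps on z, as in the paper's setup.
x_revised = langevin_revise(x_init, grad_E, n_steps=30, step_size=0.1, rng=rng)
z_revised = langevin_revise(z_init, grad_E, n_steps=10, step_size=0.1, rng=rng)
print(x_revised.shape, z_revised.shape)
```

In the paper's dual-MCMC scheme, such revised samples would then serve as targets for the generator and inference model; here the loop only illustrates the revision step itself.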