Diffusion Models as Plug-and-Play Priors

Authors: Alexandros Graikos, Nikolay Malkin, Nebojsa Jojic, Dimitris Samaras

NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We first explore the idea of generating conditional samples from an unconditional diffusion model on MNIST. We train the DDPM model of [7] on MNIST digits and experiment with different sets of constraints log c(x, y) to generate samples with specific attributes.
Researcher Affiliation | Collaboration | Alexandros Graikos (Stony Brook University, Stony Brook, NY; agraikos@cs.stonybrook.edu); Nikolay Malkin (Mila, Université de Montréal, Montréal, QC, Canada; nikolay.malkin@mila.quebec); Nebojsa Jojic (Microsoft Research, Redmond, WA; jojic@microsoft.com); Dimitris Samaras (Stony Brook University, Stony Brook, NY; samaras@cs.stonybrook.edu)
Pseudocode | Yes | Algorithm 1: Inferring a point estimate of p(x|y) ≈ δ(x − η), under a DDPM prior and constraint.
Open Source Code | Yes | The code is available at https://github.com/AlexGraikos/diffusion_priors.
Open Datasets | Yes | We train the DDPM model of [7] on MNIST digit images and experiment with different sets of constraints log c(x, y) to generate samples with specific attributes. We utilize the pretrained DDPM network on FFHQ-256 [19] from [3] and a pretrained ResNet-18 face attribute classifier on CelebA [25]. For this purpose, we use the EnviroAtlas dataset [32]. We use a dataset of Euclidean TSPs, with ground truth tours obtained by a state-of-the-art TSP solver [10], from [23].
Dataset Splits | No | The main text does not explicitly provide specific percentages, counts, or methods for training/validation/test dataset splits. While the checklist indicates this information is in the Appendix, it is not present in the provided paper text.
Hardware Specification | No | The main text of the paper does not specify the exact hardware used (e.g., specific GPU or CPU models). The checklist indicates this information can be found in the Appendix, which is not provided.
Software Dependencies | No | The paper does not explicitly list specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9). While deep learning frameworks are implied by the models used, no explicit versioning is provided in the main text.
Experiment Setup | Yes | Algorithm 1 outlines key experimental setup details: 'input: pretrained DDPM ϵθ, auxiliary data y, constraint c, time schedule (t_i), i = 1, ..., T, learning rate λ'. It also specifies 'Initialize x ∼ N(0, I)' and gives the update rule, including the learning rate: 'x ← x − λ ∇_x [ ‖ϵ − ϵθ(x_{t_i}, t_i)‖²₂ − log c(x, y) ]'. A runnable sketch of this loop appears after the table.
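
To make the quoted setup concrete, here is a minimal PyTorch sketch of the inference loop described in the Algorithm 1 and Experiment Setup rows: noise the current point estimate x to level t_i, score it against the frozen DDPM's noise prediction, subtract the log-constraint, and take a gradient step on x. The function and argument names (infer_point_estimate, eps_theta, log_c, t_schedule, alphas_cumprod) are illustrative assumptions, not identifiers from the authors' repository.

```python
import torch

def infer_point_estimate(eps_theta, log_c, y, alphas_cumprod, t_schedule,
                         lr=1e-2, shape=(1, 1, 28, 28), device="cpu"):
    """Sketch of the quoted Algorithm 1, under assumed interfaces:
    eps_theta(x_t, t) predicts the noise (an nn.Module), log_c(x, y) returns
    a scalar log-constraint, alphas_cumprod is a 1-D tensor of cumulative
    alphas from the DDPM's noise schedule. Not the authors' implementation."""
    # Assume the DDPM prior is frozen; gradients flow to x only.
    for p in eps_theta.parameters():
        p.requires_grad_(False)

    x = torch.randn(shape, device=device, requires_grad=True)  # x ~ N(0, I)
    opt = torch.optim.SGD([x], lr=lr)  # plain step: x <- x - lr * grad

    for t in t_schedule:  # e.g. a decreasing schedule of timesteps
        eps = torch.randn_like(x)  # fresh noise each step
        a_bar = alphas_cumprod[t]
        # Forward-noise the current estimate to level t (DDPM forward process).
        x_t = a_bar.sqrt() * x + (1.0 - a_bar).sqrt() * eps

        t_batch = torch.full((shape[0],), t, device=device, dtype=torch.long)
        # Denoising-matching term minus the log-constraint, as in Algorithm 1:
        # || eps - eps_theta(x_t, t) ||^2 - log c(x, y)
        loss = ((eps - eps_theta(x_t, t_batch)) ** 2).sum() - log_c(x, y)

        opt.zero_grad()
        loss.backward()
        opt.step()

    return x.detach()
```

Using plain SGD here mirrors the algorithm's literal update rule; any first-order optimizer on x would fit the same structure.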
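
For the MNIST and CelebA attribute experiments quoted above, the constraint log c(x, y) can be instantiated as a classifier log-likelihood for a target class. The sketch below illustrates that pattern only; make_classifier_constraint and its arguments are invented names, and the paper's actual constraints may differ in detail.

```python
import torch.nn.functional as F

def make_classifier_constraint(classifier, target_class):
    """Hypothetical helper: builds log c(x, y) as the log-probability a
    pretrained classifier assigns to a chosen class. Illustrative only."""
    def log_c(x, y=None):
        logits = classifier(x)                     # (B, num_classes)
        log_probs = F.log_softmax(logits, dim=-1)  # normalize to log-probs
        return log_probs[:, target_class].sum()    # scalar to maximize
    return log_c

# e.g., steer samples toward the digit 3 with a pretrained MNIST classifier:
# log_c = make_classifier_constraint(mnist_classifier, target_class=3)
# x = infer_point_estimate(ddpm, log_c, y=None,
#                          alphas_cumprod=a_bar, t_schedule=schedule)
```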