On Analyzing Generative and Denoising Capabilities of Diffusion-based Deep Generative Models

Authors: Kamil Deja, Anna Kuzina, Tomasz Trzciński, Jakub M. Tomczak

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We experimentally validate our proposition, showing its pros and cons. [...] Third, we empirically assess the performance of DDGMs and DAED on three datasets (Fashion MNIST, CIFAR10, CelebA) in terms of data generation and transferability (i.e., how DDGMs behave on different data distributions)."
Researcher Affiliation | Collaboration | Kamil Deja (Warsaw University of Technology, Warsaw, Poland; kamil.deja@pw.edu.pl); Anna Kuzina (Vrije Universiteit Amsterdam, Amsterdam, the Netherlands; a.kuzina@vu.nl); Tomasz Trzciński (Warsaw University of Technology; Jagiellonian University of Cracow; Tooploox; IDEAS NCBR; tomasz.trzcinski@pw.edu.pl); Jakub M. Tomczak (Vrije Universiteit Amsterdam, Amsterdam, the Netherlands; j.m.tomczak@vu.nl)
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. For orientation, a sketch of the standard DDPM training step that such models build on appears after this table.
Open Source Code | Yes | "All implementation details and hyperparameters are included in the Appendix (A.4) and code repository." Repository: https://github.com/KamilDeja/analysing_ddgm
Open Datasets | Yes | "We run experiments on three standard benchmarks with different complexity: Fashion MNIST [31] of gray-scale 28×28 images, CIFAR10 [13] of 32×32 natural images, and CelebA [14] of 64×64 photographs of faces." A loading sketch for these benchmarks follows this table.
Dataset Splits | No | The paper states that training details, including data splits, are in the Appendix (A.4), but the main text provides no train/validation/test split percentages or sample counts.
Hardware Specification | No | "Computation was carried out on the Dutch national e-infrastructure with the support of SURF Cooperative." The checklist indicates hardware details are in the supplementary material, but none are given in the main text.
Software Dependencies | No | The paper does not name specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9) in the main text. An environment-capture sketch for reproducers follows this table.
Experiment Setup | No | "All implementation details and hyperparameters are included in the Appendix (A.4) and code repository." The main text itself contains no specific hyperparameter values or detailed training configurations.
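
Since the Pseudocode row notes that the paper contains no algorithm blocks, here is a minimal sketch of the standard DDPM training step (Ho et al., 2020) that diffusion-based deep generative models of this kind build on. The noise-prediction network eps_model, the timestep count T = 1000, and the linear beta schedule are illustrative assumptions, not the authors' exact configuration, which lives in their Appendix A.4 and code repository.

```python
import torch

# Illustrative linear beta schedule (an assumption; the paper's actual
# schedule is specified in its appendix, not reproduced here).
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

def ddpm_training_step(eps_model, x0):
    """One standard DDPM training step: predict the injected noise.

    eps_model is a hypothetical noise-prediction network (e.g., a UNet)
    taking (x_t, t) and returning an estimate of eps. x0 is a batch of
    images shaped (batch, channels, height, width).
    """
    b = x0.shape[0]
    t = torch.randint(0, T, (b,), device=x0.device)        # random timestep per sample
    eps = torch.randn_like(x0)                             # Gaussian noise to inject
    a_bar = alpha_bars.to(x0.device)[t].view(b, 1, 1, 1)   # cumulative signal level
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * eps     # forward diffusion q(x_t | x_0)
    loss = torch.nn.functional.mse_loss(eps_model(x_t, t), eps)  # eps-prediction objective
    return loss
```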
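The three benchmarks quoted in the Open Datasets row are all available through torchvision. A minimal loading sketch follows, assuming standard torchvision dataset classes and transforms that merely match the quoted resolutions, not the authors' preprocessing pipeline.

```python
import torchvision
import torchvision.transforms as transforms

# Fashion MNIST (gray-scale 28x28) and CIFAR10 (natural 32x32) download
# and load at their native resolutions.
fashion_mnist = torchvision.datasets.FashionMNIST(
    root="./data", train=True, download=True, transform=transforms.ToTensor()
)
cifar10 = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True, transform=transforms.ToTensor()
)
# CelebA images are 178x218; center-crop and resize to 64x64 to match the
# resolution quoted in the paper. Note: the torchvision CelebA download is
# hosted on Google Drive and can fail, in which case a manual download of
# the archive into ./data is required.
celeba = torchvision.datasets.CelebA(
    root="./data", split="train", download=True,
    transform=transforms.Compose([
        transforms.CenterCrop(178),
        transforms.Resize(64),
        transforms.ToTensor(),
    ]),
)
```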
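Because the paper pins no versioned dependencies, anyone attempting a reproduction may want to record the environment they actually ran. A minimal sketch, assuming a PyTorch-based setup like the linked repository; it documents the reproducer's environment, not the authors' original one.

```python
import sys
import torch
import torchvision

# Print the versions actually in use, since the paper does not pin any.
print("python:", sys.version.split()[0])
print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("cuda available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("cuda:", torch.version.cuda)
    print("gpu:", torch.cuda.get_device_name(0))
```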