Blurring Diffusion Models
Authors: Emiel Hoogeboom, Tim Salimans
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 6 EXPERIMENTS |
| Researcher Affiliation | Industry | Emiel Hoogeboom, Google Research, Brain Team, Amsterdam, Netherlands; Tim Salimans, Google Research, Brain Team, Amsterdam, Netherlands |
| Pseudocode | Yes | A.1 PSEUDO-CODE OF DIFFUSION AND DENOISING PROCESS |
| Open Source Code | No | An example of a denoising diffusion implementation https://github.com/w86763777/pytorch-ddpm |
| Open Datasets | Yes | CIFAR10 dataset (Krizhevsky et al., 2009) |
| Dataset Splits | No | No explicit mention of a validation dataset split used for training or hyperparameter tuning. |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory) used for running experiments are provided. |
| Software Dependencies | No | No specific software dependencies with version numbers are mentioned. |
| Experiment Setup | Yes | All models were optimized with Adam, with a learning rate of 2×10⁻⁴ and batch size 128 for CIFAR-10, and a learning rate of 1×10⁻⁴ and batch size 256 for the LSUN models. All methods are evaluated with an exponential moving average computed with a decay of 0.9999. |
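The reported experiment setup can be collected into a small configuration sketch. This is an illustrative reconstruction, not the authors' code: the `CONFIG` layout and the `ema_update` helper are assumptions, while the learning rates, batch sizes, and EMA decay come from the table above.

```python
# Hedged sketch of the training configuration reported in the paper.
# CONFIG structure and ema_update are illustrative, not from the authors' code.

CONFIG = {
    "cifar10": {"optimizer": "Adam", "lr": 2e-4, "batch_size": 128},
    "lsun": {"optimizer": "Adam", "lr": 1e-4, "batch_size": 256},
    "ema_decay": 0.9999,
}

def ema_update(ema_params, params, decay=0.9999):
    """Exponential moving average of parameters:
    ema <- decay * ema + (1 - decay) * current."""
    return [decay * e + (1.0 - decay) * p for e, p in zip(ema_params, params)]

# Example: after one step, the EMA moves only 0.01% toward the new value.
ema = ema_update([0.0], [1.0], decay=CONFIG["ema_decay"])
```

With a decay of 0.9999, the evaluation weights average over roughly the last 10,000 training steps, which is why the table notes that all methods are evaluated with an EMA rather than the raw final weights.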