A Flexible Diffusion Model

Authors: Weitao Du, He Zhang, Tao Yang, Yuanqi Du

ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We present numerical experiments on synthetic datasets, MNIST and CIFAR10 to validate the effectiveness of our framework.
Researcher Affiliation | Academia | (1) Academy of Mathematics and Systems Science, Chinese Academy of Sciences; (2) Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University; (3) Cornell University.
Pseudocode | No | The paper describes algorithmic steps in prose and equations but does not present any explicitly labeled 'Pseudocode' or 'Algorithm' blocks.
Open Source Code | No | The paper does not make any explicit statement about releasing the source code, nor does it include a link to a code repository.
Open Datasets | Yes | In this section, we demonstrate the generative capacity of our FP-Diffusion models on two common image datasets: MNIST (LeCun, 1998) and CIFAR10 (Krizhevsky et al., 2009).
Dataset Splits | No | The paper describes a 'two-stage training strategy' but does not explicitly provide percentages, sample counts, or citations for specific training/validation/test dataset splits.
Hardware Specification | Yes | All the experiments are conducted on 4 Nvidia Tesla V100 16G GPUs.
Software Dependencies | No | The paper mentions using the Adam optimizer but does not specify version numbers for any software components, libraries, or programming languages used in the experiments.
Experiment Setup | Yes | For all experiments, we set β_max as 20 and β_min as 0.1... All models are trained with the Adam optimizer with a learning rate 2 × 10⁻⁴ and a batch size 96. In the MNIST experiment, we first train the whole model for 50k iterations and train the score model for another 250k iterations... In the CIFAR10 experiment, the training iterations of both stage 1 and stage 2 are 600k.
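To make the reported experiment setup concrete, the sketch below collects the quoted hyperparameters into a single configuration and builds the Adam optimizer accordingly. This is a minimal illustration in PyTorch, not the authors' code: the module and variable names are assumptions, and the linear β(t) schedule is the standard VP-SDE parameterization these β_min/β_max values usually refer to, which the paper does not spell out here.

```python
import torch

# Hyperparameter values quoted from the paper's experiment setup.
# The surrounding code structure is an illustrative assumption.
CONFIG = {
    "beta_min": 0.1,        # β_min of the diffusion noise schedule
    "beta_max": 20.0,       # β_max of the diffusion noise schedule
    "lr": 2e-4,             # Adam learning rate
    "batch_size": 96,
    # (stage-1 iterations, stage-2 score-only iterations) per dataset
    "iterations": {"mnist": (50_000, 250_000), "cifar10": (600_000, 600_000)},
}


def beta(t: float) -> float:
    """Linear schedule β(t) = β_min + t (β_max − β_min) for t in [0, 1].

    Assumes the standard VP-SDE parameterization; the paper only reports
    the β_min and β_max values themselves.
    """
    return CONFIG["beta_min"] + t * (CONFIG["beta_max"] - CONFIG["beta_min"])


def make_optimizer(model: torch.nn.Module) -> torch.optim.Adam:
    """Adam optimizer with the learning rate reported for all experiments."""
    return torch.optim.Adam(model.parameters(), lr=CONFIG["lr"])
```

A two-stage run would then call make_optimizer once per stage, first training the whole model and afterwards only the score network for the iteration counts listed in CONFIG["iterations"].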