Toward Understanding Generative Data Augmentation

Authors: Chenyu Zheng, Guoqiang Wu, Chongxuan Li

NeurIPS 2023

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Simulation results on the Gaussian mixture model and empirical results on generative adversarial nets support our theoretical conclusions. Our code is available at https://github.com/ML-GSAI/Understanding-GDA. |
| Researcher Affiliation | Academia | Chenyu Zheng (1,2), Guoqiang Wu (3), Chongxuan Li (1,2); (1) Gaoling School of Artificial Intelligence, Renmin University of China, Beijing, China; (2) Beijing Key Laboratory of Big Data Management and Analysis Methods, Beijing, China; (3) School of Software, Shandong University, Shandong, China |
| Pseudocode | No | The paper does not contain any explicit pseudocode blocks or algorithms. |
| Open Source Code | Yes | Our code is available at https://github.com/ML-GSAI/Understanding-GDA. |
| Open Datasets | Yes | In this part, we conduct experiments on the real CIFAR-10 dataset [52] with ResNets [54] and various deep generative models, including conditional DCGAN (cDCGAN) [55], StyleGAN2-ADA [56] and elucidating diffusion model (EDM) [30]. |
| Dataset Splits | No | The paper mentions training on a "train set" and evaluating on a "test set" but does not specify a validation set or explicit train/validation/test split percentages or counts. |
| Hardware Specification | Yes | All experiments are conducted on 8 NVIDIA GeForce RTX 3090 GPUs. |
| Software Dependencies | Yes | Our code is implemented with PyTorch [81] and Python 3.8.10. |
| Experiment Setup | Yes | Batch size is set to 128. We train 200 epochs for all classifiers. |
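Beyond the CIFAR-10/ResNet setup quoted above, the paper reports simulation results on a Gaussian mixture model. The following is a minimal NumPy sketch of generative data augmentation in such a toy setting, purely for illustration: the class means, sample sizes, per-class Gaussian "generative model", and nearest-class-mean classifier are all our own assumptions, not the authors' code or exact simulation protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class Gaussian mixture; dimensions and separations are illustrative.
d = 2
mu = np.array([[2.0, 0.0], [-2.0, 0.0]])  # true class means

def sample(n_per_class, means):
    """Draw n_per_class points per class from unit-variance Gaussians."""
    X = np.vstack([m + rng.standard_normal((n_per_class, d)) for m in means])
    y = np.repeat(np.arange(len(means)), n_per_class)
    return X, y

# Small real training set.
X_real, y_real = sample(20, mu)

# "Generative model": fit a per-class Gaussian mean on the real data,
# then sample synthetic points from it (the augmentation step).
fitted = np.array([X_real[y_real == c].mean(axis=0) for c in (0, 1)])
X_syn, y_syn = sample(100, fitted)

# Augmented training set = real data + generated data.
X_aug = np.vstack([X_real, X_syn])
y_aug = np.concatenate([y_real, y_syn])

def fit_predict(X_tr, y_tr, X_te):
    """Nearest-class-mean classifier: assign each test point to the
    class whose training mean is closest in Euclidean distance."""
    means = np.array([X_tr[y_tr == c].mean(axis=0) for c in (0, 1)])
    dists = ((X_te[:, None, :] - means[None]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

# Compare test accuracy with and without the generated samples.
X_te, y_te = sample(2000, mu)
acc_real = (fit_predict(X_real, y_real, X_te) == y_te).mean()
acc_aug = (fit_predict(X_aug, y_aug, X_te) == y_te).mean()
print(f"real-only acc: {acc_real:.3f}, augmented acc: {acc_aug:.3f}")
```

Whether the augmented classifier beats the real-only one depends on how faithful the fitted generator is relative to the real sample size, which is the kind of trade-off the paper's theory addresses.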