CcGAN: Continuous Conditional Generative Adversarial Networks for Image Generation

Authors: Xin Ding, Yongwei Wang, Zuheng Xu, William J Welch, Z. Jane Wang

ICLR 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experiments on the Circular 2-D Gaussians, RC-49, and UTKFace datasets show that CcGAN is able to generate diverse, high-quality samples from the image distribution conditional on a given regression label. Moreover, in these experiments, CcGAN substantially outperforms cGAN both visually and quantitatively.
Researcher Affiliation | Academia | Xin Ding, Yongwei Wang, Zuheng Xu, William J. Welch, Z. Jane Wang, The University of British Columbia. {xin.ding@stat, yongweiw@ece, zuheng.xu@stat, will@stat, zjanew@ece}.ubc.ca
Pseudocode | Yes | Algorithm 1: An algorithm for CcGAN training with the proposed HVDL. Algorithm 2: An algorithm for CcGAN training with the proposed SVDL.
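The HVDL and SVDL losses referenced above weight each real sample by how close its label is to the target label. A minimal sketch of those vicinity weights, assuming the hard vicinity uses an indicator on |y − y′| ≤ κ and the soft vicinity uses an exponential decay exp(−ν(y − y′)²); the function names are illustrative and not taken from the paper's released code:

```python
import math

def hvdl_weight(y: float, y_target: float, kappa: float) -> float:
    # Hard vicinity: uniform weight for labels within kappa of the
    # target label, zero weight otherwise.
    return 1.0 if abs(y - y_target) <= kappa else 0.0

def svdl_weight(y: float, y_target: float, nu: float) -> float:
    # Soft vicinity: weight decays smoothly with the squared distance
    # between the sample's label and the target label.
    return math.exp(-nu * (y - y_target) ** 2)

# With the paper's rule-of-thumb values (kappa ~ 0.017, nu = 3600),
# a label 0.1 away from the target gets zero hard weight and a
# near-zero soft weight.
print(hvdl_weight(0.6, 0.5, 0.017))  # 0.0
print(svdl_weight(0.5, 0.5, 3600))   # 1.0
```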
Open Source Code | Yes | The code for this paper is available on GitHub: https://github.com/UBCDingXin/improved_CcGAN
Open Datasets | Yes | Our experiments on the Circular 2-D Gaussians, RC-49, and UTKFace datasets show that CcGAN is able to generate diverse, high-quality samples from the image distribution conditional on a given regression label. Moreover, in these experiments, CcGAN substantially outperforms cGAN both visually and quantitatively. The UTKFace dataset is an age regression dataset (Zhang et al., 2017), with human face images collected in the wild.
Dataset Splits | No | The paper describes training and test sets, but a distinct validation split (e.g., with specific percentages or counts for model tuning) is not explicitly provided for all experiments. For RC-49, it states "The remaining images are held out for evaluation," which could include validation but is not explicitly defined as such.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU models, CPU types, or memory) used for the experiments. It mentions architectures such as SNGAN and BigGAN but not the underlying hardware.
Software Dependencies | No | The paper mentions the Adam optimizer and Blender v2.79 for data generation, but it does not provide version numbers for other key software dependencies (e.g., Python, PyTorch, TensorFlow) used for GAN training.
Experiment Setup | Yes | Both cGAN and CcGAN are trained for 6,000 iterations. We use the rule-of-thumb formulae in Supp. S.4 to select the hyper-parameters of HVDL and SVDL, i.e., σ ≈ 0.074, κ ≈ 0.017 and ν = 3600. Both cGAN and CcGAN are trained for 30,000 iterations with batch size 256. Both cGAN and CcGAN are trained for 40,000 iterations with batch size 512. Adam (Kingma & Ba, 2015) optimizer (with β1 = 0.5 and β2 = 0.999), with a constant learning rate of 10^-4.
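The reported optimizer settings can be written as a short PyTorch configuration sketch. The `netG`/`netD` modules below are trivial stand-ins, not the paper's SNGAN/BigGAN architectures; only the Adam hyper-parameters (β1 = 0.5, β2 = 0.999, constant learning rate 1e-4) come from the reported setup:

```python
import torch

# Stand-in generator and discriminator; the paper uses SNGAN/BigGAN
# backbones, which are not reproduced here.
netG = torch.nn.Linear(128, 64)
netD = torch.nn.Linear(64, 1)

# Adam with beta1 = 0.5, beta2 = 0.999 and a constant lr of 1e-4,
# matching the hyper-parameters reported in the experiment setup.
optG = torch.optim.Adam(netG.parameters(), lr=1e-4, betas=(0.5, 0.999))
optD = torch.optim.Adam(netD.parameters(), lr=1e-4, betas=(0.5, 0.999))
```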