Towards Generalized Implementation of Wasserstein Distance in GANs
Authors: Minkai Xu
AAAI 2021, pp. 10514-10522
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Based on the relaxed duality, we further propose a generalized WGAN training scheme named Sobolev Wasserstein GAN, and empirically demonstrate the improvement over existing methods with extensive experiments. |
| Researcher Affiliation | Academia | University of Montreal (minkai.xu@umontreal.ca); work performed while at Shanghai Jiao Tong University. |
| Pseudocode | No | The paper states 'We leave the detailed training procedure in Appendix.', but the main body does not contain any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | Code is available at https://github.com/MinkaiXu/SobolevWassersteinGAN. |
| Open Datasets | Yes | Data. We test different GANs on CIFAR-10 (Krizhevsky, Hinton et al. 2009) and Tiny-ImageNet (Deng et al. 2009), which are standard datasets widely used in the GAN literature. |
| Dataset Splits | No | The paper mentions using 50K samples for evaluation metrics but does not explicitly describe train, validation, or test splits (as percentages or sample counts) for its experiments. While these standard datasets have predefined splits, the paper does not state that it uses them or what they are. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions using 'Adam optimizer' but does not specify software dependencies with version numbers (e.g., specific Python, PyTorch, or TensorFlow versions, or other libraries). |
| Experiment Setup | Yes | For the SWGAN metaparameter, we choose 8 as the sample size m. The Adam optimizer (Kingma and Ba 2014) is set with a learning rate decaying from 2×10⁻⁴ to 0 over 100K iterations, with β₁ = 0, β₂ = 0.9. We used 5 critic updates per generator update, and the batch size was 64. (A configuration sketch follows the table.) |
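The Experiment Setup row pins down the optimization recipe. Below is a minimal PyTorch sketch of that configuration: linear learning-rate decay from 2×10⁻⁴ to 0 over 100K iterations, Adam with β₁ = 0, β₂ = 0.9, 5 critic updates per generator update, and batch size 64. The networks, latent size, and data batch are hypothetical placeholders, and the loss shown is the plain WGAN objective; the paper's actual architectures and the SWGAN Sobolev constraint (estimated from m = 8 samples) live in the authors' repository.

```python
import torch
import torch.nn as nn
from torch.optim import Adam
from torch.optim.lr_scheduler import LambdaLR

# Hypothetical stand-in networks; the real architectures are in
# https://github.com/MinkaiXu/SobolevWassersteinGAN.
generator = nn.Sequential(nn.Linear(128, 3 * 32 * 32), nn.Tanh())
critic = nn.Sequential(nn.Linear(3 * 32 * 32, 1))

TOTAL_ITERS = 100_000  # LR decays linearly from 2e-4 to 0 over 100K iterations
CRITIC_STEPS = 5       # 5 critic updates per generator update (paper)
BATCH_SIZE = 64        # batch size 64 (paper)
LATENT_DIM = 128       # assumed latent size; not stated in the excerpt

opt_g = Adam(generator.parameters(), lr=2e-4, betas=(0.0, 0.9))
opt_d = Adam(critic.parameters(), lr=2e-4, betas=(0.0, 0.9))

# Linear decay of the learning rate to zero over 100K generator iterations.
decay = lambda step: max(0.0, 1.0 - step / TOTAL_ITERS)
sched_g = LambdaLR(opt_g, lr_lambda=decay)
sched_d = LambdaLR(opt_d, lr_lambda=decay)

for it in range(TOTAL_ITERS):
    for _ in range(CRITIC_STEPS):
        # Placeholder batch; a CIFAR-10 loader would supply real images here.
        real = torch.rand(BATCH_SIZE, 3 * 32 * 32) * 2 - 1
        z = torch.randn(BATCH_SIZE, LATENT_DIM)
        fake = generator(z).detach()
        # Plain WGAN critic objective; SWGAN additionally enforces its
        # Sobolev constraint (estimated from m = 8 samples), omitted here.
        loss_d = critic(fake).mean() - critic(real).mean()
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()

    z = torch.randn(BATCH_SIZE, LATENT_DIM)
    loss_g = -critic(generator(z)).mean()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
    sched_g.step()
    sched_d.step()
```

The betas of (0.0, 0.9) mirror the paper's β₁ = 0, β₂ = 0.9, a setting commonly used in WGAN-style training where heavy momentum can destabilize the alternating critic/generator updates.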