GAN Memory with No Forgetting
Authors: Yulai Cong, Miaoyun Zhao, Jianqiao Li, Sijia Wang, Lawrence Carin
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments demonstrate the superiority of our method over existing approaches and its effectiveness in alleviating catastrophic forgetting for lifelong classification problems. |
| Researcher Affiliation | Academia | Yulai Cong Miaoyun Zhao Jianqiao Li Sijia Wang Lawrence Carin Department of Electrical and Computer Engineering Duke University |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is available at https://github.com/MiaoyunZhao/GANmemory_LifelongLearning. |
| Open Datasets | Yes | To demonstrate the superiority of our GAN memory over existing replay-based methods, we design a challenging lifelong generation problem consisting of 6 perceptually-distant tasks/datasets (see Figure 5): Flowers [57], Cathedrals [99], Cats [97], Brain-tumor images [15], Chest X-rays [35], and Anime faces. The GP-GAN [49] trained on CelebA [43] (D0) is selected as the base; other well-behaved GAN models may readily be considered. |
| Dataset Splits | No | The paper refers to training data and testing performance but does not specify explicit train/validation/test splits or their sizes, nor does it reference a standard split that includes a validation set, which would be needed for reproducibility. |
| Hardware Specification | Yes | The Titan Xp GPU used in this research was donated by the NVIDIA Corporation. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions) that would allow for reproducible setup of the environment. |
| Experiment Setup | No | The main text defers all configuration details, stating only: "Detailed experimental settings are given in Appendix A." |