Why Spectral Normalization Stabilizes GANs: Analysis and Improvements
Authors: Zinan Lin, Vyas Sekar, Giulia Fanti
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Theoretically, we show that BSSN gives better gradient control than SN. Empirically, we demonstrate that it outperforms SN in sample quality and training stability on several benchmark datasets. |
| Researcher Affiliation | Academia | Zinan Lin, Carnegie Mellon University, Pittsburgh, PA 15213, zinanl@andrew.cmu.edu; Vyas Sekar, Carnegie Mellon University, Pittsburgh, PA 15213, vsekar@andrew.cmu.edu; Giulia Fanti, Carnegie Mellon University, Pittsburgh, PA 15213, gfanti@andrew.cmu.edu |
| Pseudocode | No | The paper describes procedures but does not include a clearly labeled 'Pseudocode' or 'Algorithm' block or figure. |
| Open Source Code | Yes | The code for reproducing the results is at https://github.com/fjxmlzn/BSN. |
| Open Datasets | Yes | More specifically, we conduct experiments on CIFAR10, STL10, CelebA, and ImageNet (ILSVRC2012) |
| Dataset Splits | Yes | All experimental details are attached in Apps. N to S. |
| Hardware Specification | Yes | Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] See App. T. |
| Software Dependencies | No | The paper refers readers to its open-source repository for reproducibility details ('See https://github.com/fjxmlzn/BSN'), but the provided text does not explicitly list software dependencies with version numbers. |
| Experiment Setup | Yes | All experimental details are attached in Apps. N to S. |