How to Fill the Optimum Set? Population Gradient Descent with Harmless Diversity
Authors: Chengyue Gong, Lemeng Wu, Qiang Liu
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate that our method can efficiently generate diverse solutions on multiple applications, e.g. text-to-image generation, text-to-mesh generation, molecular conformation generation and ensemble neural network training. |
| Researcher Affiliation | Academia | 1Department of Computer Science, University of Texas at Austin. |
| Pseudocode | Yes | Algorithm 1 Diversity-aware Gradient Descent (Fsum) |
| Open Source Code | No | The paper links to 'https://github.com/NVlabs/stylegan2-ada-pytorch' in the appendix, which is a third-party tool used in their experiments, not the open-source code for their proposed methodology. There is no explicit statement or link indicating the release of their own source code. |
| Open Datasets | Yes | We use BigGAN for ImageNet image generation and StyleGAN-v2 for high-resolution image generation. |
| Dataset Splits | No | The paper mentions training and testing on datasets but does not specify how the datasets were split into training, validation, and test sets, and no validation split is described. |
| Hardware Specification | Yes | Hours is measured on an NVIDIA GeForce RTX 3090 GPU. |
| Software Dependencies | No | The paper mentions software components like the 'Adam (Kingma & Ba, 2014) optimizer', 'BigGAN', 'StyleGAN-v2', and 'ResNet-56 models', but does not provide specific version numbers for any programming languages, libraries, or frameworks used. |
| Experiment Setup | Yes | We adopt gradient descent with a constant learning rate 5 × 10⁻⁴ and 1,000 iterations. |
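The quoted setup (constant learning rate 5 × 10⁻⁴, 1,000 iterations) can be illustrated with a minimal sketch of population gradient descent with a diversity term. This is *not* the paper's Algorithm 1; the toy loss `f(x) = (‖x‖² − 1)²` (whose optimum set is the unit circle, so diversity along it is "harmless"), the mean-repulsion diversity gradient, and the weight `div_weight` are all assumptions made for illustration.

```python
import numpy as np

def loss_grad(x):
    # Gradient of the toy loss f(x) = (||x||^2 - 1)^2, whose optimum
    # set is the whole unit circle (an "optimum set to fill").
    return 4.0 * (np.dot(x, x) - 1.0) * x

def diversity_grad(pop, i):
    # Hypothetical repulsion term: direction pushing member i away
    # from the population mean (ascending pairwise spread).
    return pop[i] - pop.mean(axis=0)

def population_gd(pop, lr=5e-4, iters=1000, div_weight=0.1):
    # Each member descends the loss while being weakly repelled from
    # the others; lr and iters follow the quoted experiment setup.
    pop = pop.copy()
    for _ in range(iters):
        for i in range(len(pop)):
            pop[i] -= lr * (loss_grad(pop[i])
                            - div_weight * diversity_grad(pop, i))
    return pop
```

Run on a small 2-D population initialized near the circle, all members converge to radius ≈ 1 while remaining spread out rather than collapsing to a single optimum.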