Proximal Gradient Descent-Ascent: Variable Convergence under KŁ Geometry
Authors: Ziyi Chen, Yi Zhou, Tengyu Xu, Yingbin Liang
ICLR 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This is the first theoretical result on variable convergence for nonconvex minimax optimization. (A generic statement of the KŁ inequality referenced in the title is sketched after this table.) |
| Researcher Affiliation | Academia | Ziyi Chen, Yi Zhou: Department of ECE, University of Utah, Salt Lake City, UT 84112, USA ({u1276972,yi.zhou}@utah.edu). Tengyu Xu, Yingbin Liang: Department of ECE, The Ohio State University, Columbus, OH 43210, USA ({xu.3260,liang.889}@osu.edu). |
| Pseudocode | Yes | Algorithm 1: Proximal-GDA (an illustrative code sketch of the alternating update follows this table). |
| Open Source Code | No | The paper does not provide any statement or link regarding the release of source code for the methodology described. |
| Open Datasets | No | This is a theoretical paper and does not use or reference any datasets for training. |
| Dataset Splits | No | This is a theoretical paper and does not specify training/validation/test dataset splits. |
| Hardware Specification | No | This is a theoretical paper and does not report on experiments, thus no hardware specifications are provided. |
| Software Dependencies | No | This is a theoretical paper and does not report on experiments or provide specific software dependencies with version numbers. |
| Experiment Setup | No | This is a theoretical paper and does not describe any experimental setup details. |
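
For context on the "KŁ geometry" in the paper's title: the convergence analysis relies on a Kurdyka-Łojasiewicz (KŁ) inequality. A commonly used parametrized form is sketched below; the exact desingularizing function and parameter range used in the paper may differ, so treat this as a generic statement rather than the paper's own definition.

```latex
% Generic KL inequality at a critical point z* of a proper function F
% (a sketch; the paper's exact parametrization is not reproduced here):
\[
  \varphi'\!\big(F(z) - F(z^{\ast})\big)\,
  \operatorname{dist}\!\big(0,\, \partial F(z)\big) \;\ge\; 1
  \quad \text{for all } z \text{ near } z^{\ast} \text{ with }
  F(z^{\ast}) < F(z) < F(z^{\ast}) + \epsilon,
\]
\[
  \text{where } \varphi(s) = \tfrac{c}{\theta}\, s^{\theta}
  \text{ for some } c > 0 \text{ and exponent } \theta \in (0, 1].
\]
```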
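The pseudocode row refers to Algorithm 1 (Proximal-GDA), which alternates a proximal gradient step on the minimization variable x with a gradient ascent step on the maximization variable y. Below is a minimal Python sketch of that alternating update, assuming a prox step on x followed by an ascent step on y. The toy quadratic objective, the ℓ1 regularizer g = λ‖·‖₁, the step sizes, and the choice to evaluate the y-gradient at the freshly updated x are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the illustrative choice of regularizer g)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gda(grad_x, grad_y, prox_g, x0, y0,
                 eta_x=0.01, eta_y=0.05, num_iters=1000):
    """Alternating proximal gradient descent-ascent loop (illustrative sketch)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(num_iters):
        # Proximal gradient descent step on the minimization variable x.
        x = prox_g(x - eta_x * grad_x(x, y), eta_x)
        # Gradient ascent step on the maximization variable y
        # (evaluated here at the freshly updated x; alternating form assumed).
        y = y + eta_y * grad_y(x, y)
    return x, y

if __name__ == "__main__":
    # Toy smooth saddle objective, strongly concave in y (illustrative only):
    # f(x, y) = 0.5 * ||A x||^2 + y^T B x - 0.5 * mu * ||y||^2,  g(x) = lam * ||x||_1
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 10))
    B = rng.standard_normal((3, 10))
    mu, lam = 1.0, 0.1

    grad_x = lambda x, y: A.T @ (A @ x) + B.T @ y   # gradient of f in x
    grad_y = lambda x, y: B @ x - mu * y            # gradient of f in y
    prox_g = lambda v, eta: soft_threshold(v, eta * lam)

    x_star, y_star = proximal_gda(grad_x, grad_y, prox_g,
                                  np.ones(10), np.zeros(3))
    print(x_star, y_star)
```

The sketch keeps the two time scales explicit (eta_x for the descent step, eta_y for the ascent step); the paper's theory concerns what step-size conditions and KŁ exponents guarantee that the variable sequence (x_t, y_t) itself converges, which this toy loop does not verify.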