Accelerated Single-Call Methods for Constrained Min-Max Optimization

Authors: Yang Cai, Weiqiang Zheng

ICLR 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We also provide illustrative numerical experiments in Appendix E.
Researcher Affiliation | Academia | Yang Cai (Yale University, yang.cai@yale.edu); Weiqiang Zheng (Yale University, weiqiang.zheng@yale.edu)
Pseudocode | No | The paper describes the update rules for its algorithms (e.g., Optimistic Gradient (OG) and Accelerated Reflected Gradient (ARG)) in text, but it does not present them in a formalized pseudocode block or algorithm environment. (See the single-call update sketch after this table.)
Open Source Code | Yes | The code can be found in the Supplementary Material.
Open Datasets | No | The paper describes a 'Test Problem' used for the numerical experiments (Problem 1 in Malitsky (2015)), which is a specific mathematical formulation rather than a public dataset in the traditional sense of a collection of data samples for training. No explicit access information (link, DOI, or citation with authors and year) is provided for a public dataset.
Dataset Splits | No | The paper describes a 'Test Problem' and experimental setup, but it does not mention any training, validation, or testing splits for a dataset. The evaluation is based on a convergence criterion (the residual).
Hardware Specification | Yes | We run experiments using Python 3.9 on Jupyter Notebook, on a MacBook Air (M1, 2020) running macOS 12.5.1.
Software Dependencies | Yes | We run experiments using Python 3.9 on Jupyter Notebook, on a MacBook Air (M1, 2020) running macOS 12.5.1.
Experiment Setup | Yes | We denote η to be the step size, and the termination criterion is the residual (operator norm) ||F(z_t)|| ≤ ε. ... With step size η = 0.4, EG is slower than RG. ... With the optimized step size η = 0.7... With step size η = 0.5, FEG is slower than ARG. With the optimized step size η = 1... (See the termination-loop sketch after this table.)
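
Since the paper describes its single-call update rules only in prose, a minimal sketch may help. The Python sketch below shows the two classical constrained single-call updates the paper builds on, Optimistic Gradient (OG) and Reflected Gradient (RG; Malitsky, 2015). It is not the paper's accelerated ARG method, which we do not attempt to reproduce here. The ball-shaped constraint set, the `project` helper, and the function names are illustrative assumptions, not details from the paper.

```python
import numpy as np

def project(z, radius=1.0):
    """Euclidean projection onto a ball of radius `radius`.

    Illustrative assumption: the methods work with any closed convex
    constraint set Z via its projection operator; a ball keeps this
    sketch self-contained.
    """
    norm = np.linalg.norm(z)
    return z if norm <= radius else z * (radius / norm)

def og_step(F, z_curr, z_prev, eta):
    """One (projected) Optimistic Gradient update:
        z_{t+1} = Proj_Z[ z_t - eta * (2*F(z_t) - F(z_{t-1})) ].
    Single-call: F(z_{t-1}) can be cached from the previous iteration,
    so only one fresh operator evaluation is needed per step.
    """
    return project(z_curr - eta * (2.0 * F(z_curr) - F(z_prev)))

def rg_step(F, z_curr, z_prev, eta):
    """One Reflected Gradient update (Malitsky, 2015):
        z_{t+1} = Proj_Z[ z_t - eta * F(2*z_t - z_{t-1}) ].
    Also single-call: one operator evaluation per step.
    """
    return project(z_curr - eta * F(2.0 * z_curr - z_prev))
```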
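The experiment setup row implies a simple driver loop: iterate a method with step size η until the residual ||F(z_t)|| falls below ε. Below is a hedged sketch of such a loop, reusing the step functions from the previous sketch; the tolerance, iteration cap, and the initialization z_{-1} = z_0 are our assumptions, not details from the paper.

```python
import numpy as np

def run_until_residual(F, step_fn, z0, eta, eps=1e-6, max_iters=100_000):
    """Run a single-call method until ||F(z_t)|| <= eps.

    `step_fn` is one of the update sketches above (og_step or rg_step).
    `eps` and `max_iters` are illustrative defaults, not the paper's.
    """
    z_prev, z_curr = z0.copy(), z0.copy()  # convention: z_{-1} = z_0
    for t in range(max_iters):
        if np.linalg.norm(F(z_curr)) <= eps:
            return z_curr, t  # residual termination criterion met
        z_prev, z_curr = z_curr, step_fn(F, z_curr, z_prev, eta)
    return z_curr, max_iters  # budget exhausted without convergence
```

For example, running `run_until_residual(F, rg_step, z0, eta=0.4)` and again with `eta=0.7` mirrors the kind of step-size comparison quoted in the table above; the operator `F` and start point `z0` would come from whatever test problem one plugs in.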