Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure

Authors: Angela Yuan, Chris Junchi Li, Gauthier Gidel, Michael Jordan, Quanquan Gu, Simon S. Du

NeurIPS 2023

Reproducibility assessment. Each entry below lists the variable, the extracted result, and the LLM's supporting response.
Research Type: Theoretical
LLM Response: We consider the problem of solving stochastic monotone variational inequalities with a separable structure using a stochastic first-order oracle. Building on standard extragradient for variational inequalities, we propose a novel algorithm, stochastic accelerated gradient-extragradient (AG-EG), for strongly monotone variational inequalities (VIs). Our approach combines the strengths of extragradient and Nesterov acceleration. By showing that its iterates remain in a bounded domain and applying scheduled restarting, we prove that AG-EG has an optimal convergence rate for strongly monotone VIs. (The standard extragradient update this builds on is written out after this table.)
Researcher Affiliation: Academia
LLM Response: Department of Computer Science, University of California, Los Angeles ({hzyuan, qgu}@cs.ucla.edu); Department of Electrical Engineering and Computer Sciences, University of California, Berkeley ({junchili, jordan}@cs.berkeley.edu); DIRO, Université de Montréal and Mila (gauthier.gidel@umontreal.ca); Department of Statistics, University of California, Berkeley; Paul G. Allen School of Computer Science and Engineering, University of Washington (ssdu@cs.washington.edu)
Pseudocode: Yes
LLM Response: Algorithm 1, "Stochastic Accelerated Gradient-ExtraGradient (AG-EG) Descent-Ascent Algorithm, with Scheduled Restarting". (A minimal sketch of such a restarted loop, under stated assumptions, follows this table.)
Open Source Code: No
LLM Response: The paper does not provide a direct link to open-source code or explicitly state that the code for its methodology has been released.

Open Datasets: No
LLM Response: The paper is theoretical and does not describe experiments that use a dataset; therefore, no public dataset information is provided.

Dataset Splits: No
LLM Response: The paper is theoretical and does not describe experiments with dataset splits. No information about training, validation, or test splits is provided.

Hardware Specification: No
LLM Response: The paper is theoretical and focuses on algorithm design and convergence rates. It does not describe any experimental setup or the specific hardware used to run experiments.

Software Dependencies: No
LLM Response: The paper is theoretical and does not describe any experimental setup or specific software dependencies with version numbers.

Experiment Setup: No
LLM Response: The paper is theoretical and focuses on algorithms and convergence proofs. It does not provide details on experimental setup such as hyperparameters, batch sizes, or training schedules.
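For readers who want the setup the abstract refers to made concrete, the standard (Korpelevich) extragradient update that AG-EG builds on can be written as follows. This is the textbook form for a monotone operator F over a closed convex set Z, not the paper's exact notation, and it omits the Nesterov-acceleration component that distinguishes AG-EG:

\[
\text{find } z^\star \in \mathcal{Z} \text{ such that } \langle F(z^\star),\, z - z^\star \rangle \ge 0 \quad \text{for all } z \in \mathcal{Z},
\]
\[
z_{t+1/2} = \Pi_{\mathcal{Z}}\big(z_t - \eta\, F(z_t)\big), \qquad
z_{t+1} = \Pi_{\mathcal{Z}}\big(z_t - \eta\, F(z_{t+1/2})\big),
\]

where \(\Pi_{\mathcal{Z}}\) is the Euclidean projection onto \(\mathcal{Z}\) and \(\eta\) is a stepsize. Strong monotonicity, the regime the abstract targets, means \(\langle F(x) - F(y),\, x - y \rangle \ge \mu \|x - y\|^2\) for some \(\mu > 0\).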
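Since the pseudocode entry names Algorithm 1 with scheduled restarting, below is a minimal Python sketch of that outer structure: scheduled restarting wrapped around a stochastic extragradient inner loop. The inner update is the standard extragradient step, not the paper's exact AG-EG iteration (which additionally interleaves Nesterov-style acceleration); the function names, the stepsize-halving schedule, the noise model, and the toy operator are all illustrative assumptions.

import numpy as np

def stochastic_extragradient(F, z0, eta, n_steps, rng, noise_std=0.1):
    # Inner loop: standard stochastic extragradient on an operator F over R^n
    # (no projection, i.e. Z = R^n). Each oracle call returns F(z) plus
    # Gaussian noise, mimicking a stochastic first-order oracle.
    z = z0.copy()
    avg = np.zeros_like(z0)
    for _ in range(n_steps):
        g_half = F(z) + noise_std * rng.standard_normal(z.shape)   # oracle at z_t
        z_half = z - eta * g_half                                  # extrapolation step
        g = F(z_half) + noise_std * rng.standard_normal(z.shape)   # oracle at z_{t+1/2}
        z = z - eta * g                                            # update step
        avg += z_half
    return avg / n_steps  # averaged extrapolation iterates

def restarted_solver(F, z0, eta0, n_epochs, steps_per_epoch, seed=0):
    # Scheduled restarting: run the inner loop, restart from its averaged
    # output, and halve the stepsize each epoch. The halving schedule is an
    # illustrative assumption, not the paper's schedule.
    rng = np.random.default_rng(seed)
    z, eta = z0, eta0
    for _ in range(n_epochs):
        z = stochastic_extragradient(F, z, eta, steps_per_epoch, rng)
        eta *= 0.5
    return z

# Toy strongly monotone VI: F(z) = A z + b where the symmetric part of A is
# mu * I, so <F(x) - F(y), x - y> = mu * ||x - y||^2.
if __name__ == "__main__":
    mu = 1.0
    A = mu * np.eye(2) + np.array([[0.0, 2.0], [-2.0, 0.0]])  # identity + skew coupling
    b = np.array([1.0, -1.0])
    F = lambda z: A @ z + b
    z_star = np.linalg.solve(A, -b)  # unconstrained VI solution: F(z*) = 0
    z_hat = restarted_solver(F, np.zeros(2), eta0=0.1, n_epochs=8, steps_per_epoch=500)
    print("distance to solution:", np.linalg.norm(z_hat - z_star))

Restarting of this kind is the usual mechanism for converting guarantees for monotone problems into the faster rates available under strong monotonicity, which matches the role "scheduled restarting" plays in the abstract above.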