Gradient-Variation Bound for Online Convex Optimization with Constraints

Authors: Shuang Qiu, Xiaohan Wei, Mladen Kolar

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we provide an instance-dependent bound for online convex optimization with complex constraints obtained by a novel online primal-dual mirror-prox algorithm. Our instance-dependent regret is quantified by the total gradient variation V(T) in the sequence of loss functions. The proposed algorithm works in general normed spaces and simultaneously achieves an O(√V(T)) regret and an O(1) constraint violation, which is never worse than the best-known (O(√T), O(1)) result and improves over previous works that applied mirror-prox-type algorithms for this problem, achieving O(T^{2/3}) regret and constraint violation. Finally, our algorithm is computationally efficient, as it only performs mirror descent steps in each iteration instead of solving a general Lagrangian minimization problem. ... We propose a novel online primal-dual mirror-prox method and prove a strong theoretical result that it can achieve a gradient-variation regret bound and simultaneously maintain the O(1) constraint violation in a general normed space.
Researcher Affiliation | Collaboration | ¹Booth School of Business, The University of Chicago; ²Meta Platforms, Inc.
Pseudocode | Yes | Algorithm 1: Online Primal-Dual Mirror-Prox Algorithm ... Algorithm 2: Online Primal-Dual Mirror-Prox Algorithm on Probability Simplex
Open Source Code | No | The paper does not contain any explicit statements or links indicating that source code for the described methodology is publicly available.
Open Datasets | No | The paper is theoretical and focuses on algorithm design and proofs. It does not describe any empirical studies or experiments that would involve training on a dataset.
Dataset Splits | No | The paper is theoretical and focuses on algorithm design and proofs. It does not describe any empirical studies or experiments that would involve dataset splits for validation.
Hardware Specification | No | The paper is theoretical and focuses on algorithm design and proofs. It does not report any experimental results, and therefore, no hardware specifications for running experiments are mentioned.
Software Dependencies | No | The paper is theoretical and focuses on algorithm design and proofs. It does not report any experimental results, and therefore, no specific software dependencies with version numbers are mentioned.
Experiment Setup | No | The paper is theoretical and focuses on algorithm design and proofs. It does not describe any empirical experiments, and therefore, no experimental setup details like hyperparameters or training settings are provided.
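The abstract's key mechanism — mirror descent steps on the primal and dual variables of a Lagrangian, with an extrapolation ("prox") step — can be illustrated in the Euclidean special case. The sketch below is our own toy rendering under stated assumptions (Euclidean mirror map, a single scalar constraint, hypothetical function names and step size), not the paper's general normed-space algorithm:

```python
import numpy as np

def online_primal_dual_mirror_prox(grad_f, g, grad_g, x0, T, eta=0.05,
                                   box=(-5.0, 5.0)):
    """Euclidean extragradient sketch for the saddle problem
    min_x max_{lam >= 0}  f_t(x) + lam * g(x),  with g(x) <= 0 the constraint.

    grad_f(t, x): gradient of the round-t loss at x (hypothetical signature).
    All names here are illustrative, not from the paper.
    """
    lo, hi = box
    x, lam = x0, 0.0
    xs = []
    for t in range(T):
        # Extrapolation ("look-ahead") step on the Lagrangian at (x, lam).
        x_half = np.clip(x - eta * (grad_f(t, x) + lam * grad_g(x)), lo, hi)
        lam_half = max(0.0, lam + eta * g(x))
        # Update step: re-evaluate gradients at the extrapolated point.
        x = np.clip(x - eta * (grad_f(t, x_half) + lam_half * grad_g(x_half)),
                    lo, hi)
        lam = max(0.0, lam + eta * g(x_half))
        xs.append(x)
    return np.array(xs)

# Toy instance: fixed loss f(x) = (x - 2)^2 with constraint x - 1 <= 0;
# the constrained optimum is x* = 1 (with multiplier lam* = 2).
xs = online_primal_dual_mirror_prox(
    grad_f=lambda t, x: 2.0 * (x - 2.0),
    g=lambda x: x - 1.0,
    grad_g=lambda x: 1.0,
    x0=0.0, T=2000, eta=0.05,
)
```

On this toy instance the iterates settle at the constrained optimum x = 1 rather than the unconstrained minimizer x = 2, with the dual variable absorbing the constraint pressure; note each round costs only two projected gradient (mirror descent) steps, matching the efficiency claim that no inner Lagrangian minimization is solved.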
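Algorithm 2 in the paper specializes the method to the probability simplex. With the negative-entropy mirror map, a single mirror descent step reduces to the standard exponentiated-gradient (multiplicative-weights) update; the minimal standalone sketch below shows just that step (function name and toy instance are our own, not from the paper):

```python
import numpy as np

def entropic_mirror_descent_step(x, grad, eta):
    """One mirror descent step on the probability simplex under the
    negative-entropy mirror map: x_i <- x_i * exp(-eta * grad_i),
    then renormalize so the result stays on the simplex."""
    w = x * np.exp(-eta * grad)
    return w / w.sum()

# Toy usage: minimize the linear loss <c, x> over the simplex;
# the mass concentrates on the coordinate with the smallest cost.
c = np.array([0.3, 0.1, 0.5])
x = np.full(3, 1.0 / 3.0)
for _ in range(200):
    x = entropic_mirror_descent_step(x, c, eta=0.5)
```

The entropic mirror map is the natural choice on the simplex because its update is closed-form and dimension-friendly (the regret constant scales with log of the dimension rather than the dimension itself).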