Projection-Free Online Convex Optimization via Efficient Newton Iterations

Authors: Khashayar Gatmiry, Zak Mhammedi

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper presents new projection-free algorithms for Online Convex Optimization (OCO) over a convex domain K ⊆ R^d... As our main contribution, we show how the stability of the Newton iterates can be leveraged to only compute the inverse Hessian a vanishing fraction of the rounds, leading to a new efficient projection-free OCO algorithm with a state-of-the-art regret bound.
Researcher Affiliation | Academia | Khashayar Gatmiry, MIT, gatmiry@mit.edu; Zakaria Mhammedi, MIT, mhammedi@mit.edu
Pseudocode | Yes | Algorithm 1 BARONS: Barrier-Regularized Online Newton Step (see the illustrative sketch after this table)
Open Source Code | No | No statements or links regarding open-source code for the described methodology are provided in the paper.
Open Datasets | No | The paper is a theoretical work focused on algorithm design and regret analysis; it does not conduct empirical studies or use datasets for training.
Dataset Splits | No | The paper is a theoretical work and does not perform experiments, so it does not describe any dataset splits (training, validation, test).
Hardware Specification | No | The paper is a theoretical work and does not describe any hardware used for running experiments.
Software Dependencies | No | The paper focuses on theoretical analysis and algorithm design and does not mention any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameter values or training configurations.
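Since the paper releases no code, the following is a minimal, illustrative Python sketch of the high-level idea quoted in the abstract: run an online Newton-style update while reusing a cached inverse Hessian across rounds instead of recomputing it every round. It is not the authors' BARONS algorithm; the function name lazy_newton_oco, the step size eta, and the fixed recompute_every schedule are assumptions for illustration, whereas BARONS uses a barrier regularizer and a stability-based rule for when the inverse Hessian is recomputed, neither of which is reproduced here.

    import numpy as np

    def lazy_newton_oco(grad_fns, hessian_fn, x0, eta=0.1, recompute_every=10):
        """Illustrative online Newton-style loop with a lazily updated inverse Hessian.

        Only a sketch of the general idea (cache the inverse Hessian and reuse it
        across rounds); NOT the paper's BARONS algorithm, which adds barrier
        regularization, constraint handling, and a stability-based recompute rule.
        """
        x = np.asarray(x0, dtype=float)
        d = x.size
        H_inv = None
        for t, grad_fn in enumerate(grad_fns):
            if H_inv is None or t % recompute_every == 0:
                # Expensive step: (re)compute and invert the Hessian only on a
                # sparse subset of rounds, then reuse the cached inverse.
                H = hessian_fn(x)
                H_inv = np.linalg.inv(H + 1e-8 * np.eye(d))  # small ridge for safety
            g = grad_fn(x)             # gradient of the round-t loss at the current iterate
            x = x - eta * (H_inv @ g)  # Newton-style step with the cached inverse
            yield x

    if __name__ == "__main__":
        # Toy usage: quadratic losses f_t(x) = 0.5 * ||x - c_t||^2 over R^3,
        # so each gradient is x - c_t and each Hessian is the identity.
        rng = np.random.default_rng(0)
        centers = [rng.normal(size=3) for _ in range(100)]
        grads = [(lambda x, c=c: x - c) for c in centers]
        hessian = lambda x: np.eye(3)
        final_x = None
        for final_x in lazy_newton_oco(grads, hessian, x0=np.zeros(3)):
            pass
        print("final iterate:", final_x)

In this toy setup the Hessian is constant, so caching it is trivially valid; the point of the sketch is only the control flow, i.e. that the expensive inversion happens on a vanishing fraction of rounds while every round still performs a cheap Newton-style update.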