Online Convex Optimization with Continuous Switching Constraint

Authors: Guanghui Wang, Yuanyu Wan, Tianbao Yang, Lijun Zhang

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We first investigate the hardness of the problem, and provide a lower bound... We then develop a simple gradient-based algorithm which enjoys the minimax optimal regret bound. Finally, we show that, for strongly convex functions, the regret bound can be improved... In this section, we present the algorithms and theoretical guarantees for OCO-CSC. (A generic gradient-based OCO update is sketched after this table.)
Researcher Affiliation | Academia | Guanghui Wang (1), Yuanyu Wan (1,2), Tianbao Yang (3), Lijun Zhang (1,2); (1) National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China; (2) Peng Cheng Laboratory, Shenzhen, Guangdong, China; (3) Department of Computer Science, The University of Iowa, Iowa City, USA
Pseudocode | Yes | Algorithm 1 Adversary's Policy
Open Source Code | No | The paper does not provide any explicit statements or links indicating the availability of open-source code for the described methodology.
Open Datasets | No | This is a theoretical paper and does not involve the use of datasets for training or evaluation.
Dataset Splits | No | This is a theoretical paper and does not involve the use of datasets or specify any data splitting for validation.
Hardware Specification | No | As a theoretical paper, it does not describe experimental setup including hardware specifications.
Software Dependencies | No | As a theoretical paper, it does not describe software dependencies with version numbers.
Experiment Setup | No | As a theoretical paper, it does not describe an experimental setup with specific hyperparameters or training settings.
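
The "Research Type" row above quotes the paper's summary of a simple gradient-based algorithm that attains the minimax optimal regret under a continuous switching constraint. Purely as a rough illustration of what a gradient-based OCO update looks like, the following is a minimal sketch of generic projected online gradient descent, together with the cumulative path length that a continuous switching constraint would budget. This is not the paper's algorithm: the fixed step size `eta`, the L2-ball feasible set, and the function names are assumptions made for the example.

```python
import numpy as np

def project_l2_ball(x, radius=1.0):
    """Euclidean projection onto the L2 ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def online_gradient_descent(grads, eta, radius=1.0):
    """Generic projected online gradient descent (illustration only).

    grads  : list of gradient vectors g_1, ..., g_T revealed round by round
    eta    : fixed step size (the paper's tuning to respect the switching
             budget is not reproduced here)
    Returns the iterates x_1, ..., x_T and the total path length
    sum_t ||x_{t+1} - x_t||, i.e. the quantity a continuous switching
    constraint would cap.
    """
    d = len(np.asarray(grads[0]))
    x = np.zeros(d)
    iterates, switching_cost = [], 0.0
    for g in grads:
        g = np.asarray(g, dtype=float)
        iterates.append(x.copy())
        # Gradient step followed by projection back onto the feasible set.
        x_next = project_l2_ball(x - eta * g, radius)
        # Accumulate the movement between consecutive decisions.
        switching_cost += np.linalg.norm(x_next - x)
        x = x_next
    return iterates, switching_cost
```

In an actual OCO-CSC setting, the gradients would come from the adversary's loss functions, and the step size would need to be chosen so that the accumulated path length stays within the prescribed switching budget; the precise choice, and the resulting regret bounds, are given in the paper.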