Dynamic Regret of Convex and Smooth Functions
Authors: Peng Zhao, Yu-Jie Zhang, Lijun Zhang, Zhi-Hua Zhou
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We investigate online convex optimization in non-stationary environments and choose the dynamic regret as the performance measure, defined as the difference between cumulative loss incurred by the online algorithm and that of any feasible comparator sequence. Specifically, we propose novel online algorithms that are capable of leveraging smoothness and replace the dependence on T in the dynamic regret by problem-dependent quantities: the variation in gradients of loss functions, the cumulative loss of the comparator sequence, and the minimum of the previous two terms. All the proofs can be found in the full paper [18]. (The quoted definition is rendered as a formula sketch below the table.) |
| Researcher Affiliation | Academia | Peng Zhao, Yu-Jie Zhang, Lijun Zhang, Zhi-Hua Zhou National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China {zhaop, zhangyj, zhanglj, zhouzh}@lamda.nju.edu.cn |
| Pseudocode | Yes | Algorithm 1 Sword_var: Meta (Variation Hedge) (a generic Hedge-style meta-learner sketch follows the table) |
| Open Source Code | No | The paper does not provide an explicit statement about, or link to, open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not conduct experiments on datasets, so no dataset availability information is provided. |
| Dataset Splits | No | The paper is theoretical and does not describe experimental validation, so no dataset split information is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe experimental setup or hardware specifications. |
| Software Dependencies | No | The paper is theoretical and does not specify software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe experimental setup details such as hyperparameters or training configurations. |
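
For reference, the dynamic regret definition quoted in the Research Type row can be written out as follows. This is a minimal sketch based on the quoted description: the comparator sequence u_1, ..., u_T and the symbols V_T (gradient variation) and F_T (cumulative comparator loss) are assumed notation, not copied from the paper's equations.

```latex
% Dynamic regret against an arbitrary feasible comparator sequence u_1, ..., u_T
\[
\operatorname{D\text{-}Regret}_T(\mathbf{u}_1,\dots,\mathbf{u}_T)
  \;=\; \sum_{t=1}^{T} f_t(\mathbf{x}_t) \;-\; \sum_{t=1}^{T} f_t(\mathbf{u}_t)
\]

% Problem-dependent quantities named in the quote (symbols V_T and F_T are assumed):
% the variation in gradients of the loss functions, and the cumulative loss of the comparator sequence
\[
V_T \;=\; \sum_{t=2}^{T} \sup_{\mathbf{x}\in\mathcal{X}}
          \bigl\| \nabla f_t(\mathbf{x}) - \nabla f_{t-1}(\mathbf{x}) \bigr\|_2^2,
\qquad
F_T \;=\; \sum_{t=1}^{T} f_t(\mathbf{u}_t)
\]
```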
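The Pseudocode row points to "Algorithm 1 Sword_var: Meta (Variation Hedge)". The paper's algorithm is not reproduced here; the sketch below only illustrates the general structure of a Hedge-style meta-learner that combines a pool of base learners with exponentially weighted losses. The class name `HedgeMeta`, its parameters, and the plain-loss weight update are illustrative assumptions, not the paper's Sword_var update (which uses a variation-aware surrogate).

```python
import numpy as np


class HedgeMeta:
    """Generic Hedge-style meta-learner (illustrative sketch, not the paper's Sword_var).

    Maintains one weight per base learner, plays the weighted average of their
    predictions, and updates the weights multiplicatively from their losses.
    """

    def __init__(self, n_experts: int, learning_rate: float):
        # Uniform prior over the pool of base learners.
        self.weights = np.full(n_experts, 1.0 / n_experts)
        self.eta = learning_rate

    def combine(self, expert_predictions: np.ndarray) -> np.ndarray:
        # expert_predictions has shape (n_experts, d); return the weighted combination x_t.
        return self.weights @ expert_predictions

    def update(self, expert_losses: np.ndarray) -> None:
        # Multiplicative-weights update on the base learners' losses, then renormalize.
        # Sword_var would feed a variation-aware surrogate loss here instead.
        self.weights *= np.exp(-self.eta * expert_losses)
        self.weights /= self.weights.sum()
```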