Revisiting Smoothed Online Learning
Authors: Lijun Zhang, Wei Jiang, Shiyin Lu, Tianbao Yang
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we revisit the problem of smoothed online learning, in which the online learner suffers both a hitting cost and a switching cost, and target two performance metrics: competitive ratio and dynamic regret with switching cost. To bound the competitive ratio, we assume the hitting cost is known to the learner in each round, and investigate the simple idea of balancing the two costs by an optimization problem. Surprisingly, we find that minimizing the hitting cost alone is max{1, 2/α}-competitive for α-polyhedral functions and (1 + 4/λ)-competitive for λ-quadratic growth functions, both of which improve state-of-the-art results significantly. Moreover, when the hitting cost is both convex and λ-quadratic growth, we reduce the competitive ratio to 1 + 2/λ by minimizing the weighted sum of the hitting cost and the switching cost. To bound the dynamic regret with switching cost, we follow the standard setting of online convex optimization, in which the hitting cost is convex but hidden from the learner before making predictions. We modify Ader, an existing algorithm designed for dynamic regret, slightly to take into account the switching cost when measuring the performance. The proposed algorithm, named Smoothed Ader, attains an optimal O(√(T(1 + P_T))) bound for dynamic regret with switching cost, where P_T is the path-length of the comparator sequence. Furthermore, if the hitting cost is accessible at the beginning of each round, we obtain a similar guarantee without the bounded gradient condition, and establish an Ω(√(T(1 + P_T))) lower bound to confirm the optimality. |
| Researcher Affiliation | Academia | Lijun Zhang (1,2), Wei Jiang (1), Shiyin Lu (1), Tianbao Yang (3) — 1: National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China; 2: Peng Cheng Laboratory, Shenzhen, Guangdong, China; 3: Department of Computer Science, The University of Iowa, Iowa City, IA 52242, USA |
| Pseudocode | Yes | Algorithm 1 SAder: Meta-algorithm, Algorithm 2 SAder: Expert-algorithm, Algorithm 3 Lookahead SAder: Meta-algorithm, Algorithm 4 Lookahead SAder: Expert-algorithm |
| Open Source Code | No | The paper does not mention providing open-source code for the methodology described. |
| Open Datasets | No | The paper is theoretical and does not describe or use any datasets for training or empirical evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe or use any validation dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |
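The two lookahead policies quoted in the abstract — minimizing the hitting cost alone versus minimizing the weighted sum of hitting and switching cost — can be illustrated on a toy one-dimensional instance. Everything below (the quadratic hitting cost, the parameter values, the function names) is a hypothetical sketch for intuition, not the paper's setup: the paper is purely theoretical and reports no experiments.

```python
# Toy smoothed online learning instance: in round t the learner plays x_t and
# pays a hitting cost f_t(x_t) = lam * (x_t - theta_t)^2 plus a switching
# cost |x_t - x_{t-1}|.

def total_cost(decisions, thetas, lam, x0=0.0):
    """Sum of hitting costs and absolute-value switching costs over a horizon."""
    cost, prev = 0.0, x0
    for x, theta in zip(decisions, thetas):
        cost += lam * (x - theta) ** 2 + abs(x - prev)
        prev = x
    return cost

def weighted_step(theta, x_prev, lam):
    """Minimizer of lam*(x - theta)^2 + |x - x_prev|: move toward theta, but
    stop once the marginal hitting-cost saving drops below the movement cost."""
    half = 1.0 / (2.0 * lam)
    shift = max(-half, min(half, x_prev - theta))
    return theta + shift

lam = 2.0                              # quadratic-growth parameter (made up)
thetas = [0.0, 1.0, 0.5, 1.5, 1.0]    # per-round cost minimizers (made up)

# Policy 1: minimize the hitting cost alone, i.e. jump to theta_t each round.
greedy = list(thetas)

# Policy 2: minimize the weighted sum of hitting and switching cost.
weighted, prev = [], 0.0
for theta in thetas:
    prev = weighted_step(theta, prev, lam)
    weighted.append(prev)

greedy_cost = total_cost(greedy, thetas, lam)      # 3.0 on this instance
weighted_cost = total_cost(weighted, thetas, lam)  # 1.75 on this instance
```

On this particular instance the weighted-sum policy pays less than the hitting-cost-only policy; the paper's results are the stronger, worst-case statements that each policy's total cost is within a constant factor (depending on α or λ) of the offline optimum.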