Adaptive Online Learning
Authors: Dylan J. Foster, Alexander Rakhlin, Karthik Sridharan
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We propose a general framework for studying adaptive regret bounds in the online learning setting, subsuming model selection and data-dependent bounds. Given a data- or model-dependent bound, we ask: does there exist some algorithm achieving this bound? We show that modifications to recently introduced sequential complexity measures can be used to answer this question by providing sufficient conditions under which adaptive rates can be achieved. In particular, each adaptive rate induces a set of so-called offset complexity measures, and obtaining small upper bounds on these quantities is sufficient to demonstrate achievability. A cornerstone of our analysis technique is the use of one-sided tail inequalities to bound suprema of offset random processes. Our framework recovers and improves a wide variety of adaptive bounds, including quantile bounds, second-order data-dependent bounds, and small-loss bounds. In addition, we derive a new type of adaptive bound for online linear optimization based on the spectral norm, as well as a new online PAC-Bayes theorem. |
| Researcher Affiliation | Academia | Dylan J. Foster (Cornell University); Alexander Rakhlin (University of Pennsylvania); Karthik Sridharan (Cornell University) |
| Pseudocode | No | The paper focuses on theoretical derivations and proofs, but does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access to source code for the methodology described. |
| Open Datasets | No | The paper is theoretical and does not mention specific datasets or their public availability for training. |
| Dataset Splits | No | The paper is theoretical and does not describe any experimental validation with dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any experimental setup or the hardware used. |
| Software Dependencies | No | The paper is theoretical and does not mention any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details like hyperparameters or training configurations. |
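
For context on what the abstract means by an "adaptive" bound, the generic form can be sketched as follows (standard online-learning notation; this sketch is an illustration consistent with the abstract, not an excerpt from the paper):

```latex
% Regret of an online learner's predictions \hat{y}_t against a fixed
% comparator f, over a data sequence z_1, \dots, z_n:
\mathrm{Reg}_n(f) = \sum_{t=1}^{n} \ell(\hat{y}_t, z_t) - \sum_{t=1}^{n} \ell(f, z_t)

% A data- or model-dependent rate B_n is "achievable" if some strategy
% guarantees, simultaneously for every comparator f and every sequence:
\mathrm{Reg}_n(f) \le B_n(f; z_1, \dots, z_n)
```

A uniform bound corresponds to a constant $B_n$ (e.g., $O(\sqrt{n})$); the paper's question is which non-uniform choices of $B_n$, depending on the comparator and the observed data, admit such an algorithm.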