An Adaptive Algorithm for Learning with Unknown Distribution Drift
Authors: Alessio Mazzetto, Eli Upfal
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We develop and analyze a general technique for learning with an unknown distribution drift. Given a sequence of independent observations from the last T steps of a drifting distribution, our algorithm agnostically learns a family of functions with respect to the current distribution at time T. Unlike previous work, our technique does not require prior knowledge about the magnitude of the drift. Instead, the algorithm adapts to the sample data. Without explicitly estimating the drift, the algorithm learns a family of functions with almost the same error as a learning algorithm that knows the magnitude of the drift in advance. Furthermore, since our algorithm adapts to the data, it can guarantee a better learning error than an algorithm that relies on loose bounds on the drift. We demonstrate the application of our technique in two fundamental learning scenarios: binary classification and linear regression. |
| Researcher Affiliation | Academia | Alessio Mazzetto (Brown University); Eli Upfal (Brown University) |
| Pseudocode | Yes | Algorithm 1: Adaptive Learning Algorithm under Drift (an illustrative sketch of the adaptive-window idea follows this table). |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository. |
| Open Datasets | No | The paper discusses theoretical concepts using random variables Z1, ..., ZT and does not specify or provide access information for any publicly available or open datasets used for training. |
| Dataset Splits | No | The paper discusses theoretical error bounds and learning scenarios but does not provide specific training/test/validation dataset splits. It defines 'r' as the number of recent samples for theoretical analysis rather than a data split. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | No | The paper focuses on theoretical analysis and algorithm design and does not contain specific experimental setup details such as hyperparameter values or system-level training settings. |
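
The paper's Algorithm 1 is not reproduced on this page, so the snippet below is only a minimal, hypothetical sketch of the adaptive-window comparison idea described in the abstract, applied to a 1-D mean-estimation analogue of the learning problem. The function name `adaptive_window_mean`, the doubling window schedule, and the Hoeffding-style threshold are illustrative assumptions, not the paper's algorithm, which is stated for general function families.

```python
import numpy as np

def adaptive_window_mean(samples, delta=0.05):
    """Choose how many of the most recent samples to average when the drift
    magnitude is unknown, by comparing geometrically growing windows.

    Illustrative sketch only (1-D mean estimation, observations in [0, 1]);
    the paper's algorithm handles general function families.
    """
    recent = np.asarray(samples, dtype=float)[::-1]  # recent[0] is the newest sample Z_T
    T = len(recent)

    # Candidate window sizes r_1 < r_2 < ... (doubling), capped at T.
    windows = []
    r = 1
    while r <= T:
        windows.append(r)
        r *= 2

    def stat_error(r):
        # Hoeffding-style deviation term for a window of r samples, with a
        # union bound over the candidate windows (an assumed, simplified bound).
        return np.sqrt(np.log(2 * len(windows) / delta) / (2 * r))

    chosen = windows[0]
    for r_small, r_large in zip(windows, windows[1:]):
        mean_small = recent[:r_small].mean()
        mean_large = recent[:r_large].mean()
        # If the two window averages differ by more than their combined
        # statistical uncertainty, attribute the gap to drift and stop enlarging.
        if abs(mean_large - mean_small) > stat_error(r_small) + stat_error(r_large):
            break
        chosen = r_large

    return recent[:chosen].mean(), chosen


# Example: the distribution shifts 100 steps before time T.
rng = np.random.default_rng(0)
old = rng.binomial(1, 0.1, size=500)
new = rng.binomial(1, 0.9, size=100)
estimate, window = adaptive_window_mean(np.concatenate([old, new]))
```

In this toy run the window should stop growing roughly when older, pre-shift samples begin to dominate the discrepancy between consecutive windows. The design point mirrors the abstract's claim: by comparing nested windows directly on the data, the method trades statistical error against drift error without ever estimating the drift magnitude.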