Nonparametric Online Regression while Learning the Metric
Authors: Ilja Kuzborskij, Nicolò Cesa-Bianchi
NeurIPS 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study algorithms for online nonparametric regression that learn the directions along which the regression function is smoother. Our algorithm learns the Mahalanobis metric based on the gradient outer product matrix G of the regression function (automatically adapting to the effective rank of this matrix), while simultaneously bounding the regret on the same data sequence in terms of the spectrum of G. As a preliminary step in our analysis, we extend a nonparametric online learning algorithm by Hazan and Megiddo, enabling it to compete against functions whose Lipschitzness is measured with respect to an arbitrary Mahalanobis metric. |
| Researcher Affiliation | Academia | Ilja Kuzborskij, EPFL, Switzerland, ilja.kuzborskij@gmail.com; Nicolò Cesa-Bianchi, Dipartimento di Informatica, Università degli Studi di Milano, Milano 20135, Italy, nicolo.cesa-bianchi@unimi.it |
| Pseudocode | Yes | Algorithm 1 Nonparametric online regression |
| Open Source Code | No | The paper does not provide any specific links or statements about the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper focuses on theoretical analysis and does not specify the use of any particular public or open dataset for training. It discusses a theoretical setup where 'instances x_t are realizations of i.i.d. random variables X_t drawn according to some fixed and unknown distribution µ'. |
| Dataset Splits | No | The paper does not provide specific details on train/validation/test dataset splits, as it is a theoretical work. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for experiments. |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not include concrete details about an experimental setup, such as hyperparameter values or training configurations. |
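The abstract above describes prediction under a Mahalanobis metric induced by the gradient outer product matrix G. As a minimal illustrative sketch (not the paper's Algorithm 1, which maintains an adaptive ball cover and learns the metric online), the following shows online nearest-neighbour regression under a fixed Mahalanobis metric; the matrix `M` here is a hypothetical stand-in for an estimate of G:

```python
import numpy as np

def mahalanobis_dist(x, y, M):
    """Distance induced by a positive semi-definite matrix M: sqrt((x-y)^T M (x-y))."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ M @ d))

class OnlineNNRegressor:
    """Online 1-nearest-neighbour regression under a fixed Mahalanobis metric.

    Illustrative only: the paper's algorithm additionally adapts the metric
    from gradient estimates and bounds regret via the spectrum of G; here the
    metric M is supplied up front and each observed pair is simply stored.
    """

    def __init__(self, M):
        self.M = np.asarray(M, dtype=float)
        self.centers = []  # stored instances x_t
        self.values = []   # stored labels y_t

    def predict(self, x):
        # Predict the label of the closest stored instance in the M-metric.
        if not self.centers:
            return 0.0
        i = min(range(len(self.centers)),
                key=lambda j: mahalanobis_dist(x, self.centers[j], self.M))
        return self.values[i]

    def update(self, x, y):
        # Store the revealed pair for use in future predictions.
        self.centers.append(np.asarray(x, dtype=float))
        self.values.append(float(y))
```

With `M` set to the identity this reduces to ordinary Euclidean nearest-neighbour prediction; a metric weighted toward directions where the regression function varies faster is what the learned G is meant to provide.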