Non-Asymptotic Uniform Rates of Consistency for k-NN Regression
Authors: Heinrich Jiang (pp. 3999-4006)
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We derive high-probability finite-sample uniform rates of consistency for k-NN regression that are optimal up to logarithmic factors under mild assumptions. We moreover show that k-NN regression adapts to an unknown lower intrinsic dimension automatically in the sup-norm. We then apply the k-NN regression rates to establish new results about estimating the level sets and global maxima of a function from noisy observations. |
| Researcher Affiliation | Industry | Heinrich Jiang, Google Research, Mountain View, CA |
| Pseudocode | No | The paper is theoretical and focuses on mathematical proofs and theorems. It does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper describes theoretical results for k-NN regression based on samples drawn from a density, but it does not mention or use any specific publicly available datasets for experiments. |
| Dataset Splits | No | The paper is theoretical and does not describe any experimental setup with training, validation, or test dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not mention any hardware specifications used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies or versions. |
| Experiment Setup | No | The paper is theoretical and focuses on mathematical derivations and proofs. It does not describe any experimental setup details such as hyperparameters or training configurations. |
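Although the paper itself provides no pseudocode or implementation, the estimator it analyzes is the standard k-NN regressor: the prediction at a query point is the average response of its k nearest training points. The following is a minimal illustrative sketch of that estimator (not code from the paper; the function and variable names are our own):

```python
import math

def knn_regress(train_x, train_y, query, k):
    """Estimate f(query) as the mean response of the k nearest training points.

    Illustrative sketch of the standard k-NN regression estimator;
    not an implementation released with the paper.
    """
    # Indices of training points sorted by Euclidean distance to the query.
    order = sorted(range(len(train_x)),
                   key=lambda i: math.dist(train_x[i], query))
    # Average the responses of the k nearest neighbors.
    nearest = order[:k]
    return sum(train_y[i] for i in nearest) / k

# Example: noiseless samples of f(x) = x on a 1-D grid.
train_x = [(i / 10,) for i in range(11)]
train_y = [x[0] for x in train_x]
print(knn_regress(train_x, train_y, (0.5,), k=3))
```

With noiseless f(x) = x on this grid, the three nearest neighbors of 0.5 are 0.4, 0.5, and 0.6, so the estimate is their mean, 0.5. The paper's results bound how far such estimates can deviate from the true function uniformly over the query space, as a function of n, k, and the smoothness of f.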