Optimal Convergence Rates for Agnostic Nyström Kernel Learning

Authors: Jian Li, Yong Liu, Weiping Wang

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we provide a refined generalization analysis for Nyström approximation in the agnostic setting, where the target regression function may lie outside the hypothesis space. Specifically, we show that Nyström approximation can still achieve capacity-dependent optimal rates in the agnostic setting. To this end, we first prove capacity-dependent optimal guarantees for Nyström approximation with standard uniform sampling, which covers both loss functions and applies to some agnostic settings. Then, using data-dependent sampling, for example leverage scores sampling, we derive capacity-dependent optimal rates that apply to the whole range of the agnostic setting. To the best of our knowledge, this is the first time capacity-dependent optimality has been achieved for the whole range of the agnostic setting in Nyström approximation.
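The paper itself contains no code, but the technique it analyzes is standard. As a rough illustration only (not the authors' estimator), the following NumPy sketch shows rank-m Nyström approximation of a kernel matrix with the uniform landmark sampling the paper studies; all function names and parameters here are our own choices.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.2):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_approximation(X, m, gamma=0.2, seed=None):
    """Rank-m Nystrom approximation K_hat = C W^+ C^T of the full kernel
    matrix, using uniform sampling of m landmark points (one of the two
    sampling schemes the paper analyzes; leverage-score sampling would
    replace the uniform choice of `idx`)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # uniform landmark sampling
    C = rbf_kernel(X, X[idx], gamma)             # n x m cross-kernel block
    W = C[idx]                                   # m x m landmark kernel block
    # Pseudo-inverse for numerical stability when W is ill-conditioned.
    return C @ np.linalg.pinv(W) @ C.T

# Usage: compare against the exact kernel matrix on toy data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
K = rbf_kernel(X, X)
K_hat = nystrom_approximation(X, m=50, seed=0)
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

For positive semdefinite K the residual K - K_hat is itself PSD, so the relative Frobenius error is at most 1 and shrinks as the number of landmarks m grows.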
Researcher Affiliation | Academia | (1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China; (2) Gaoling School of Artificial Intelligence, Renmin University of China, Beijing, China; (3) Beijing Key Laboratory of Big Data Management and Analysis Methods, Beijing, China. Correspondence to: Yong Liu <liuyonggsai@ruc.edu.cn>.
Pseudocode | No | The paper is highly theoretical, focusing on mathematical derivations and proofs. It does not include any pseudocode or algorithm blocks.
Open Source Code | No | The paper does not contain any statements about making code open-source, nor does it provide links to any code repositories.
Open Datasets | No | The paper is purely theoretical and does not use, or provide access information for, any public datasets for training or evaluation.
Dataset Splits | No | The paper is theoretical and does not discuss empirical dataset splits (training, validation, or test).
Hardware Specification | No | The paper is theoretical and discusses computational complexity only in the abstract. It does not mention any specific hardware (e.g., GPU/CPU models, memory) used for computations.
Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies or version numbers needed to replicate computational work.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters, training configurations, or system-level settings.