On the Consistency of Kernel Methods with Dependent Observations
Authors: Pierre-François Massiani, Sebastian Trimpe, Friedrich Solowjow
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper is purely theoretical. It proposes the notion of empirical weak convergence (EWC) as a general assumption under which kernel methods remain consistent with dependent observations, and its main results establish consistency of SVMs, kernel mean embeddings, and general Hilbert-space-valued empirical expectations under EWC. The analysis covers both finite- and infinite-dimensional outputs, extending classical results of statistical learning to the latter case, and is presented as a foundation for a theory of learning beyond i.i.d. and mixing. |
| Researcher Affiliation | Academia | Institute for Data Science in Mechanical Engineering, RWTH Aachen University, Aachen, Germany. |
| Pseudocode | No | The paper focuses on theoretical derivations and proofs, and does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not mention releasing any open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not conduct experiments with datasets. |
| Dataset Splits | No | The paper is theoretical and does not conduct experiments with datasets, so no training/validation/test splits are discussed. |
| Hardware Specification | No | The paper is theoretical and does not report on experimental setup, thus no hardware specifications are provided. |
| Software Dependencies | No | The paper is theoretical and does not report on experimental setup, thus no software dependencies are listed. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training settings. |
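For background on one of the objects whose consistency the paper studies, the following is a minimal sketch (not from the paper, which contains no code) of an empirical kernel mean embedding with a Gaussian kernel. The function names and the choice of kernel are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel k(x, y) = exp(-gamma * ||x - y||^2).
    return np.exp(-gamma * np.sum((x - y) ** 2))

def empirical_kme(samples, gamma=1.0):
    # Empirical kernel mean embedding: mu_hat(t) = (1/n) * sum_i k(x_i, t).
    # For i.i.d. or mixing data (and, per the paper, EWC data), mu_hat
    # converges to the embedding of the data-generating distribution.
    def mu_hat(t):
        return np.mean([gaussian_kernel(x, t, gamma) for x in samples])
    return mu_hat

rng = np.random.default_rng(0)
samples = rng.standard_normal((500, 1))  # 500 draws from N(0, 1)
mu = empirical_kme(samples)
value_at_zero = mu(np.zeros(1))          # embedding evaluated at the origin
```

Since the Gaussian kernel takes values in (0, 1], so does the embedding estimate at any point; for this standard-normal example `value_at_zero` approximates E[exp(-x^2)] = 1/sqrt(3).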