Stability and Deviation Optimal Risk Bounds with Convergence Rate $O(1/n)$

Authors: Yegor Klochkov, Nikita Zhivotovskiy

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | The paper is entirely devoted to stating and proving mathematical theorems on the generalization error and excess risk of machine learning algorithms, with a focus on uniform stability. It contains a 'Main results' section with Theorems 1.1 and 1.2 and a dedicated 'Proofs' section; there is no mention of experiments, datasets, empirical evaluation, or performance metrics. (The key definition is sketched below the table.)
Researcher Affiliation | Academia | Yegor Klochkov (Cambridge-INET, Faculty of Economics, University of Cambridge; yk376@cam.ac.uk); Nikita Zhivotovskiy (Department of Mathematics, ETH Zürich; nikita.zhivotovskii@math.ethz.ch).
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. It primarily presents mathematical definitions, theorems, and proofs.
Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the methodology described.
Open Datasets | No | The paper is theoretical and does not involve experiments with datasets, so no publicly available training datasets are referenced.
Dataset Splits | No | The paper is theoretical and does not involve experiments with datasets, so no training/validation/test splits are described.
Hardware Specification | No | The paper is theoretical and reports no experiments, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and describes no implementation, so no software dependencies with version numbers are mentioned.
Experiment Setup | No | The paper focuses on mathematical proofs and describes no experiments, so no experimental setup, hyperparameters, or system-level training settings are reported.
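
For context on the Research Type row above, the paper's central object is uniform stability. What follows is a paraphrased sketch of the standard definition and the schematic shape of the paper's headline bound; the constants and exact logarithmic factors are illustrative assumptions on our part, not a quotation of Theorem 1.1, and should be checked against the paper.

A learning algorithm $A$ mapping a sample $S = (z_1, \dots, z_n)$ to a predictor $A(S)$ is $\gamma$-uniformly stable with respect to a loss $\ell$ if, for any two samples $S, S'$ differing in a single element and any point $z$,

$$|\ell(A(S), z) - \ell(A(S'), z)| \le \gamma.$$

For a bounded loss $\ell \in [0, M]$ and under a Bernstein-type condition on the excess loss class, the paper shows that (near-)empirical risk minimizers satisfy, with probability at least $1 - \delta$, an excess risk bound of the schematic form

$$R(A(S)) - \inf_f R(f) \lesssim \left(\gamma \log n + \frac{M}{n}\right) \log\frac{1}{\delta},$$

which removes the $\Theta(1/\sqrt{n})$ sampling-error term present in earlier high-probability stability bounds. In particular, for strongly convex and Lipschitz losses, where ERM is uniformly stable with $\gamma = O(1/n)$, this yields the $O(\log n / n)$ high-probability excess risk rate announced in the title.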