Exponential Generalization Bounds with Near-Optimal Rates for $L_q$-Stable Algorithms

Authors: Xiaotong Yuan, Ping Li

ICLR 2023

Reproducibility Variable Result LLM Response
Research Type Theoretical In Section 2, we first establish in Theorem 1 an $L_q$-norm inequality for sums of functions of random variables with $L_q$-norm bounded differences. Equipped with this general-purpose concentration inequality, we then prove in Theorem 2 an $L_q$-norm generalization bound for $L_q$-stable learning algorithms for all $q \ge 2$.
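For context, the $L_q$-stability notion referenced above is, in its standard form (paraphrased here rather than quoted from the paper; the symbols $A$, $S$, $f$, $\epsilon_q$ are illustrative), a bound on the $L_q$-norm of the loss perturbation when one training example is replaced:

```latex
% Standard L_q-stability (replace-one stability measured in L_q norm).
% Notation is illustrative and may differ from the paper's.
\[
  \sup_{z}\,
  \big\| f\!\big(A(S); z\big) - f\!\big(A(S^{(i)}); z\big) \big\|_{L_q}
  \;\le\; \epsilon_q ,
\]
% where S^{(i)} denotes S with its i-th example replaced by an independent
% copy, and \|X\|_{L_q} := (E|X|^q)^{1/q}.
```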
Researcher Affiliation Collaboration Xiao-Tong Yuan, School of Intelligence Science and Technology, Nanjing University, Suzhou 215163, China (xtyuan1980@gmail.com); Ping Li, LinkedIn Ads, 700 Bellevue Way NE, Bellevue, WA 98004, USA (pinli@linkedin.com)
Pseudocode Yes Algorithm 1: Inexact $L_0$-ERM Oracle. Input: a training data set $S = \{Z_i\}_{i \in [N]}$ and the desired sparsity level $k$. Output: $w_{S,k}$.
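The paper only states the oracle's interface (a data set $S$ and sparsity level $k$ in, an approximately optimal $k$-sparse minimizer out). A common way to realize such an inexact $L_0$-ERM oracle is iterative hard thresholding; the pure-Python sketch below illustrates that approach for least-squares loss and is an assumption for illustration, not the authors' implementation (all function names are hypothetical):

```python
def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    top = sorted(range(len(w)), key=lambda j: abs(w[j]), reverse=True)[:k]
    keep = set(top)
    return [w[j] if j in keep else 0.0 for j in range(len(w))]

def inexact_l0_erm(X, y, k, step=0.1, iters=500):
    """Approximately solve min_w (1/2N) sum_i (x_i . w - y_i)^2
    subject to ||w||_0 <= k, via iterative hard thresholding:
    a gradient step on the empirical risk followed by projection
    onto the set of k-sparse vectors."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        # Gradient of the empirical least-squares risk at w.
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            r = sum(a * b for a, b in zip(xi, w)) - yi
            for j in range(d):
                grad[j] += r * xi[j] / n
        # Gradient step, then hard-threshold back to k-sparsity.
        w = hard_threshold([w[j] - step * grad[j] for j in range(d)], k)
    return w
```

On data generated from a $k$-sparse ground truth, the returned vector has at most $k$ nonzero entries and approaches the sparse minimizer as the iteration count grows (for a suitably small step size).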
Open Source Code No The paper does not contain any statements or links indicating that the source code for the described methodology is publicly available.
Open Datasets No The paper does not describe empirical experiments using a specific dataset for training, validation, or testing.
Dataset Splits No The paper does not describe empirical experiments using a specific dataset for training, validation, or testing.
Hardware Specification No The paper is theoretical and does not describe empirical experiments; therefore, no hardware specifications are mentioned.
Software Dependencies No The paper is theoretical and does not describe empirical experiments; therefore, no software dependencies with version numbers are mentioned.
Experiment Setup No The paper is theoretical and does not describe empirical experiments; therefore, no experimental setup details such as hyperparameters or training configurations are provided.