Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Generalization of noisy SGD in unbounded non-convex settings

Authors: Leello Tadesse Dadi, Volkan Cevher

ICML 2025 | Venue PDF | LLM Run Details

Reproducibility Variable Result LLM Response
Research Type Theoretical Formally, we establish time-independent information-theoretic generalization bounds for Stochastic Gradient Langevin Dynamics (SGLD) that do not diverge as the iteration count increases. Our bounds are obtained through a stability argument: we analyze the difference between two SGLD sequences run in parallel on two datasets sampled from the same distribution. Our result only requires an isoperimetric inequality to hold, which is merely a restriction on the tails of the loss.
Researcher Affiliation Academia EPFL, Lausanne, Switzerland. Correspondence to: Leello Dadi <EMAIL>.
Pseudocode No The paper describes the SGLD recursion as "X_{k+1} = X_k − η g(X_k, B_k) + √(2η/β) N_{k+1} (SGLD)" but does not present a formally labeled pseudocode or algorithm block.
Open Source Code No The paper does not contain any explicit statement or link regarding the availability of open-source code for the described methodology.
Open Datasets No The paper defines the empirical approximation F_n given by F_n(x, D) = (1/n) Σ_{i=1}^{n} f(x, Z_i), based on a dataset D = {Z_1, …, Z_n} of n independent, identically distributed samples from ν, but does not use or provide access to any specific dataset for experiments.
Dataset Splits No The paper is theoretical and does not conduct experiments with specific datasets, therefore it does not mention training/test/validation dataset splits.
Hardware Specification No The paper is theoretical and does not describe any experimental setup or hardware used for running experiments.
Software Dependencies No The paper is theoretical and focuses on mathematical analysis, thus it does not specify any software dependencies or versions.
Experiment Setup No The paper focuses on theoretical analysis and derivation of generalization bounds, and therefore does not include details about an experimental setup, hyperparameters, or system-level training settings.
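The SGLD recursion quoted in the Pseudocode row can be sketched in code. This is a minimal illustration assuming a generic gradient oracle; the function name `sgld_step` and the toy quadratic objective are illustrative choices, not artifacts from the paper.

```python
import numpy as np

def sgld_step(x, grad, eta, beta, rng):
    # One SGLD update: X_{k+1} = X_k - eta * g(X_k, B_k) + sqrt(2*eta/beta) * N_{k+1},
    # where N_{k+1} is standard Gaussian noise and beta is the inverse temperature.
    noise = rng.standard_normal(x.shape)
    return x - eta * grad + np.sqrt(2.0 * eta / beta) * noise

# Toy run on f(x) = ||x||^2 / 2, whose gradient at x is x itself.
rng = np.random.default_rng(0)
x = np.ones(5)
for _ in range(1000):
    x = sgld_step(x, x, eta=0.01, beta=100.0, rng=rng)
```

At large beta the injected noise is small and the iterates hover near the minimizer; as beta decreases, the dynamics explore more, which is the regime the paper's tail (isoperimetry) assumptions are meant to control.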