Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

The Implicit Regularization of Momentum Gradient Descent in Overparametrized Models

Authors: Li Wang, Zhiguo Fu, Yingcong Zhou, Zili Yan

AAAI 2023 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "The numerical experiments strongly support our theoretical results."
Researcher Affiliation | Academia | ¹School of Computer Science and Information Technology & KLAS, Northeast Normal University, China; ²School of Mathematics and Statistics, Beihua University, China
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code for the methodology described.
Open Datasets | No | The paper generates synthetic features for its experiments (e.g., "We generate features via X = Σ^(1/2)Z...") rather than using a publicly available dataset with concrete access information.
Dataset Splits | No | The paper does not provide dataset split information for training, validation, or testing.
Hardware Specification | No | The paper does not specify the hardware used to run its experiments.
Software Dependencies | No | The paper does not list ancillary software with version numbers.
Experiment Setup | Yes | "Fixing the same sufficiently small initialization w₀ = 0.01 and step size ε = 0.001, we plot the path of w for different N = 1, 2, 3, 4."
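The two experimental details surfaced above (synthetic features X = Σ^(1/2)Z, and momentum gradient descent from a small initialization w₀ = 0.01 with step size ε = 0.001) can be sketched as a minimal reproduction. This is a hypothetical illustration, not the paper's code: the covariance Σ, problem sizes, momentum coefficient β, iteration count, and least-squares loss are all assumptions filled in for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparametrized synthetic regression: more features d than samples n.
n, d = 20, 50                              # assumed sizes; the paper's choice may differ
Sigma = np.diag(np.linspace(1.0, 2.0, d))  # assumed diagonal covariance
Z = rng.standard_normal((n, d))
X = Z @ np.sqrt(Sigma)                     # X = Sigma^(1/2) Z (matrix sqrt of diagonal Sigma)
w_true = rng.standard_normal(d)
y = X @ w_true                             # realizable targets, so the residual can reach 0

# Heavy-ball (momentum) gradient descent on the least-squares loss,
# from the small initialization and step size quoted in the table.
w = np.full(d, 0.01)                       # w0 = 0.01 in every coordinate
v = np.zeros(d)
eps, beta = 0.001, 0.9                     # step size from the paper; beta is assumed

for _ in range(5000):
    grad = X.T @ (X @ w - y) / n           # gradient of (1/2n) * ||Xw - y||^2
    v = beta * v + grad                    # accumulate momentum
    w = w - eps * v

residual = np.linalg.norm(X @ w - y) / np.sqrt(n)
print(residual)                            # interpolation: residual driven toward 0
```

Because d > n the system Xw = y has infinitely many solutions; which one momentum gradient descent selects from a small initialization is exactly the implicit-regularization question the paper studies.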