A Dynamical Central Limit Theorem for Shallow Neural Networks

Authors: Zhengdao Chen, Grant Rotskoff, Joan Bruna, Eric Vanden-Eijnden

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We also complement these results with numerical experiments."
Researcher Affiliation | Academia | Zhengdao Chen: Courant Institute of Mathematical Sciences, New York University; Grant M. Rotskoff: Department of Chemistry, Stanford University; Joan Bruna: Center for Data Science, New York University; Eric Vanden-Eijnden: New York University
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement or link indicating that source code for the methodology is openly available.
Open Datasets | No | The paper describes a "student-teacher experiment" on synthetic data generated by a "teacher network", but it neither uses a well-known public dataset nor provides a link or citation for accessing the data used in the experiments (a sketch of such data generation follows the table).
Dataset Splits | No | The paper does not provide specific details about training, validation, or test dataset splits (e.g., percentages, sample counts, or a splitting methodology).
Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used to run its experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers.
Experiment Setup | Yes | The setup for the numerical experiments is described in Appendix G.1, including the learning rate and initialization: "All experiments are run with a learning rate of 0.1, and the student neurons are initialized with parameters c_i(0) = 1 and z_i(0) from a Gaussian distribution with mean 0 and variance 1." (illustrated in the sketch below)
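
For concreteness, the quoted setup can be illustrated with a minimal student-teacher sketch in Python/NumPy. Only the learning rate (0.1) and the student initialization (c_i(0) = 1, z_i(0) ~ N(0, 1)) come from the paper; the network widths, input dimension, ReLU activation, squared loss, sample count, and single-sample SGD loop below are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic data from a "teacher" network (all sizes are assumptions) ---
d, m_teacher, n_samples = 16, 5, 1024
teacher_z = rng.standard_normal((m_teacher, d))  # fixed teacher inner weights
teacher_c = rng.standard_normal(m_teacher)       # fixed teacher outer weights

def teacher(X):
    # Mean-field-style shallow net: f(x) = (1/m) * sum_i c_i * relu(z_i . x)
    return np.maximum(X @ teacher_z.T, 0.0) @ teacher_c / m_teacher

X = rng.standard_normal((n_samples, d))
y = teacher(X)  # noiseless labels generated by the teacher

# --- Student network, initialized as quoted from Appendix G.1 ---
n, lr = 100, 0.1                     # width n is an assumption; lr = 0.1 is from the paper
c = np.ones(n)                       # c_i(0) = 1
z = rng.standard_normal((n, d))     # z_i(0) ~ N(0, 1)

# --- Plain one-sample SGD on the squared loss (loop details are assumptions) ---
for _ in range(20):                  # number of epochs is arbitrary
    for xb, yb in zip(X, y):
        h = np.maximum(z @ xb, 0.0)                          # hidden activations
        err = h @ c / n - yb                                 # residual f(x) - y
        grad_c = err * h / n                                 # d(loss)/d(c)
        grad_z = err * np.outer(c * (z @ xb > 0.0), xb) / n  # d(loss)/d(z)
        c -= lr * grad_c
        z -= lr * grad_z

print("final MSE:", np.mean((np.maximum(X @ z.T, 0.0) @ c / n - y) ** 2))
```

The 1/n scaling of the student output mirrors the mean-field parameterization the paper analyzes; everything else here is a placeholder that makes the quoted hyperparameters concrete.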