Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Neural Hilbert Ladders: Multi-Layer Neural Networks in Function Space

Authors: Zhengdao Chen

JMLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we perform numerical experiments to illustrate the feature learning aspect of NN training through the lens of NHLs."
Researcher Affiliation | Industry | "Zhengdao Chen EMAIL Google Research Mountain View, CA 94043"
Pseudocode | No | The paper describes mathematical frameworks and theoretical properties but does not include any explicitly labeled pseudocode or algorithm blocks; procedures are described in natural language and mathematical notation.
Open Source Code | No | The paper contains no explicit statement about releasing source code for the described methodology and provides no link to a code repository. The license information given applies to the paper itself, not to accompanying code.
Open Datasets | No | The paper describes generating synthetic datasets for its numerical experiments: "We choose d = 10, n = 50 and ν = N(0, Id)." and "We choose d = 1, n = 20, m = 512, the target function being f*(x) = sin(2x), and ν being the uniform distribution on [0, 2π]." This describes data generation, not concrete access information for a publicly available dataset.
Dataset Splits | No | The paper uses synthetic data for its numerical illustrations, specifying only the total number of samples (n = 50 or n = 20). It does not describe any training/validation/test splits, since the focus is on illustrating theoretical dynamics rather than evaluating performance on held-out data partitions.
Hardware Specification | No | The paper presents "Numerical Illustrations" but does not specify any hardware (e.g., GPU models, CPU types, memory) used to run the experiments.
Software Dependencies | No | The paper does not mention any specific software dependencies or version numbers (e.g., programming languages or frameworks such as Python, PyTorch, or TensorFlow) needed to replicate the experimental results.
Experiment Setup | Yes | Experiment 1 (linear NN): "3-layer linear NNs trained by GD with width 64 and 8192." "We assume for simplicity that ρa has zero mean and unit variance while ρz has the zero vector as its mean and the identity matrix as its covariance." Experiment 2 (ReLU NN): "3-layer NNs with the ReLU activation on a least squares regression task." "All parameters in the model, including untrained bias terms, are sampled i.i.d. from N(0, 1) at initialization." "target function being f*(x) = sin(2x)"
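The ReLU experiment described above (a 3-layer ReLU network with all parameters, including untrained biases, drawn i.i.d. from N(0, 1), trained by gradient descent on least-squares regression of f*(x) = sin(2x), with d = 1, n = 20, m = 512, and inputs uniform on [0, 2π]) can be sketched as follows. This is a minimal NumPy reconstruction from the quoted settings only: the 1/sqrt(m) layer scaling, learning rate, and step count are assumptions not stated in the excerpt, and the paper's actual parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, m = 1, 20, 512                     # settings quoted from the paper
X = rng.uniform(0.0, 2 * np.pi, size=(n, d))   # nu = Uniform([0, 2*pi])
T = np.sin(2 * X)                        # target f*(x) = sin(2x)

# All parameters i.i.d. N(0, 1) at initialization; biases are left untrained.
W1 = rng.standard_normal((d, m)); b1 = rng.standard_normal(m)
W2 = rng.standard_normal((m, m)); b2 = rng.standard_normal(m)
W3 = rng.standard_normal((m, 1)); b3 = rng.standard_normal(1)

s = np.sqrt(m)   # 1/sqrt(width) scaling on wide layers -- an assumption
lr, steps = 0.1, 500   # assumed GD hyperparameters
losses = []

for _ in range(steps):
    # Forward pass through the 3-layer ReLU network.
    Z1 = X @ W1 + b1;       H1 = np.maximum(Z1, 0.0)
    Z2 = H1 @ W2 / s + b2;  H2 = np.maximum(Z2, 0.0)
    Y = H2 @ W3 / s + b3
    R = Y - T
    losses.append(0.5 * float(np.mean(R**2)))   # least-squares loss

    # Manual backprop; only the weight matrices are updated.
    dY = R / n
    gW3 = H2.T @ dY / s
    dZ2 = (dY @ W3.T / s) * (Z2 > 0)
    gW2 = H1.T @ dZ2 / s
    dZ1 = (dZ2 @ W2.T / s) * (Z1 > 0)
    gW1 = X.T @ dZ1
    W1 -= lr * gW1; W2 -= lr * gW2; W3 -= lr * gW3
```

Under these assumed hyperparameters the training loss should decrease over the run; the sketch is meant only to make the quoted setup concrete, not to reproduce the paper's figures.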