Hardness of Learning Neural Networks under the Manifold Hypothesis

Authors: Bobak Kiani, Jason Wang, Melanie Weber

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Looking forward, we comment on and empirically explore intermediate regimes of manifolds, which have heterogeneous features commonly found in real-world data.
Researcher Affiliation | Academia | John A. Paulson School of Engineering and Applied Sciences, Harvard University; e-mail: bkiani@g.harvard.edu. Harvard College, Harvard University; e-mail: jasonwang1@college.harvard.edu. John A. Paulson School of Engineering and Applied Sciences, Harvard University; e-mail: mweber@g.harvard.edu.
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks with explicit labels like 'Algorithm' or 'Pseudocode'.
Open Source Code | Yes | Code is made public at https://github.com/Weber-GeoML/manifold-learning-complexity.
Open Datasets | Yes | We investigate synthetic unit hyperspheres to sanity-check our method and proceed to three real-world image datasets: MNIST [35], Fashion MNIST (FMNIST) [95], and Kuzushiji-MNIST (KMNIST) [29]. (A hypersphere sampling sketch appears after the table.)
Dataset Splits | No | The paper mentions 'a training set of size 1000' and '10000 samples' but does not specify explicit training/validation/test splits with percentages or counts for reproduction.
Hardware Specification | Yes | This training was done on an NVIDIA L4 24GB GPU.
Software Dependencies | No | We use the PyTorch package for our neural network experiments [73]. No specific version number for PyTorch or other software dependencies is provided.
Experiment Setup | Yes | For Euclidean datasets, we train a fully-connected 5-layer ReLU network of width 2048 on 100000 samples for 30 epochs, batch size 64, and a learning rate of 4 × 10⁻⁴. (A hedged sketch of this setup appears after the table.)
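
Two illustrative sketches follow. First, a minimal sketch of drawing uniform samples from a unit hypersphere, the synthetic sanity-check dataset described above. The intrinsic dimension d and the ambient dimension are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn.functional as F

def sample_hypersphere(n, d, ambient_dim=None):
    """Draw n points uniformly from the unit sphere S^{d-1} in R^d.

    A standard Gaussian is rotation-invariant, so normalizing it to
    unit length gives samples uniform on the sphere. If ambient_dim
    exceeds d, the sphere is embedded in R^ambient_dim by zero-padding.
    """
    x = torch.randn(n, d)
    x = x / x.norm(dim=1, keepdim=True)
    if ambient_dim is not None and ambient_dim > d:
        x = F.pad(x, (0, ambient_dim - d))
    return x

points = sample_hypersphere(1000, d=8, ambient_dim=32)
print(points.shape)             # torch.Size([1000, 32])
print(points.norm(dim=1)[:3])   # all ones up to floating-point error

For the image datasets, torchvision.datasets.MNIST, FashionMNIST, and KMNIST provide standard loaders.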
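
Second, a minimal sketch of the reported Euclidean training setup: a fully-connected 5-layer ReLU network of width 2048 trained on 100000 samples for 30 epochs with batch size 64 and learning rate 4 × 10⁻⁴. The optimizer (Adam), the MSE loss on a scalar regression target, and the input dimension are assumptions made for illustration; the excerpt does not specify them.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

input_dim = 32          # assumed ambient dimension, not from the paper
width, depth = 2048, 5  # width and number of linear layers as reported

# 5 linear layers total: input layer, three hidden layers, output layer.
layers = [nn.Linear(input_dim, width), nn.ReLU()]
for _ in range(depth - 2):
    layers += [nn.Linear(width, width), nn.ReLU()]
layers.append(nn.Linear(width, 1))
model = nn.Sequential(*layers)

# Placeholder random data standing in for the 100000-sample dataset.
X = torch.randn(100_000, input_dim)
y = torch.randn(100_000, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=4e-4)  # assumed optimizer
loss_fn = nn.MSELoss()                                     # assumed loss

for epoch in range(30):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss_fn(model(xb), yb).backward()
        optimizer.step()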