Quantitative Propagation of Chaos for SGD in Wide Neural Networks
Authors: Valentin De Bortoli, Alain Durmus, Xavier Fontaine, Umut Simsekli
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We perform various experiments on real datasets to validate our theoretical results, assessing the existence of these two regimes on classification problems and illustrating our convergence results. |
| Researcher Affiliation | Academia | Valentin De Bortoli, University of Oxford, debortoli@stats.ox.ac.uk; Alain Durmus, Université Paris-Saclay, alain.durmus@cmla.ens-cachan.fr; Xavier Fontaine, Université Paris-Saclay, fontaine@cmla.ens-cachan.fr; Umut Şimşekli, LTCI, Télécom Paris, Institut Polytechnique de Paris, umut.simsekli@telecom-paris.fr |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any links to open-source code or state that code will be made available. |
| Open Datasets | Yes | We focus on the classification task for two datasets: MNIST [41] and CIFAR-10 [42]. |
| Dataset Splits | No | The paper mentions "training and test accuracies" but does not specify how the data were split or whether a validation set was used. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | If not specified, we set α = 0, M = 100, T = 100, γ = 1. ... We consider the following set of parameters α = 0, M = 100, T = 10000, γ = 0.1. |
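The quoted hyperparameters can be made concrete with a minimal sketch. Note the interpretation here is an assumption, not stated in the table: M is read as the hidden-layer width, T as the number of SGD iterations, γ as the step size, and α as a scaling exponent on the mean-field network output; the synthetic data stands in for MNIST/CIFAR-10.

```python
import numpy as np

# Hypothetical reading of the quoted hyperparameters (assumed, not
# defined in the table): M = width, T = SGD iterations, gamma = step
# size, alpha = output-scaling exponent.
M, T, gamma, alpha = 100, 100, 1.0, 0.0

rng = np.random.default_rng(0)
d = 10                              # synthetic input dimension
X = rng.normal(size=(256, d))       # toy data standing in for MNIST/CIFAR-10
y = np.sign(X[:, 0])                # toy binary labels

w = rng.normal(size=(M, d))         # hidden-layer weights
a = rng.normal(size=M)              # output-layer weights

def forward(x):
    # Mean-field-style output: average over M neurons, rescaled by M**alpha.
    return M ** alpha * np.mean(a * np.tanh(w @ x))

for t in range(T):
    i = rng.integers(len(X))        # one-sample SGD step
    err = forward(X[i]) - y[i]      # squared-loss residual
    h = np.tanh(w @ X[i])
    # Gradients of the per-sample squared loss w.r.t. a and w.
    grad_a = err * M ** alpha * h / M
    grad_w = err * M ** alpha * (a * (1 - h ** 2))[:, None] * X[i][None, :] / M
    a -= gamma * grad_a
    w -= gamma * grad_w
```

This is only a sketch of the experimental regime the table quotes, not the authors' actual setup; the paper itself releases neither code nor pseudocode, so the loss, architecture, and the exact role of α here are guesses.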