Dynamics of Deep Neural Networks and Neural Tangent Hierarchy
Authors: Jiaoyang Huang, Horng-Tzer Yau
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study the dynamic of neural networks of finite width and derive an infinite hierarchy of differential equations, the neural tangent hierarchy (NTH). We prove that the NTH truncated at level p ≥ 2 approximates the dynamic of the NTK up to arbitrary precision under certain conditions on the neural network width and the data set dimension. (See the equation sketch after the table.) |
| Researcher Affiliation | Academia | (1) School of Mathematics, IAS, Princeton, NJ, USA; (2) Mathematics Department, Harvard, Cambridge, MA, USA. |
| Pseudocode | No | No pseudocode or algorithm blocks were found. The paper focuses on mathematical derivations and theoretical framework. |
| Open Source Code | No | No statement about making source code available or links to a code repository were found. |
| Open Datasets | No | The paper refers to "training inputs" and "training data" but does not specify any publicly available dataset by name (e.g., MNIST, ImageNet), nor does it provide links or citations for dataset access. |
| Dataset Splits | No | The paper does not provide specific details on training, validation, or test dataset splits. It only generally refers to "training inputs". |
| Hardware Specification | No | No specific hardware details (e.g., GPU models, CPU types, memory) used for running experiments were mentioned. The paper is theoretical in nature. |
| Software Dependencies | No | No specific software dependencies with version numbers were mentioned. The paper is theoretical and does not describe implementation details. |
| Experiment Setup | No | No specific experimental setup details, such as hyperparameter values, training configurations, or system-level settings, were provided. The paper describes theoretical models and mathematical analyses, not practical implementations. |
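
The following is a minimal sketch of the hierarchy structure summarized in the "Research Type" row, written in standard NTK notation rather than the paper's exact conventions: f_t denotes the network output at training time t, {(x_β, y_β)} the n training pairs, K_t^{(2)} the neural tangent kernel, and K_t^{(r)} the higher-order kernels at each level; normalization factors may differ from the paper.

```latex
% Gradient-flow dynamics of the prediction and the neural tangent hierarchy (NTH).
% Notation assumed (not copied from the paper): f_t = network output at time t,
% (x_beta, y_beta) = training pairs, K_t^{(2)} = NTK, K_t^{(r)} = level-r kernel.
\begin{align}
  \partial_t f_t(x_\alpha)
    &= -\frac{1}{n} \sum_{\beta=1}^{n}
       K_t^{(2)}(x_\alpha, x_\beta)\,\bigl(f_t(x_\beta) - y_\beta\bigr), \\
  \partial_t K_t^{(r)}(x_{\alpha_1}, \dots, x_{\alpha_r})
    &= -\frac{1}{n} \sum_{\beta=1}^{n}
       K_t^{(r+1)}(x_{\alpha_1}, \dots, x_{\alpha_r}, x_\beta)\,
       \bigl(f_t(x_\beta) - y_\beta\bigr), \qquad r \ge 2.
\end{align}
```

Truncating at level p ≥ 2 closes this otherwise infinite system by approximating the evolution of the highest retained kernel (for example, treating it as constant in time); the paper's theoretical result is that such a truncation tracks the finite-width NTK dynamics to arbitrary precision under its conditions on network width and data set dimension.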