Global Optimality Conditions for Deep Neural Networks
Authors: Chulhee Yun, Suvrit Sra, Ali Jadbabaie
ICLR 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "For deep linear networks, we present necessary and sufficient conditions for a critical point of the risk function to be a global minimum. Surprisingly, our conditions provide an efficiently checkable test for global optimality, while such tests are typically intractable in nonconvex optimization. We further extend these results to deep nonlinear neural networks and prove similar sufficient conditions for global optimality, albeit in a more limited function space setting." and "In this section, we provide proofs for Theorems 2.1 and 2.2." |
| Researcher Affiliation | Academia | Chulhee Yun, Suvrit Sra & Ali Jadbabaie, Massachusetts Institute of Technology, Cambridge, MA 02139, USA ({chulheey,suvrit,jadbabai}@mit.edu) |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper formulates a theoretical problem involving data matrices X and Y but does not specify a concrete, publicly available dataset used for empirical training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not involve dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper does not provide specific hardware details used for running experiments. |
| Software Dependencies | No | The paper does not specify any software dependencies or version numbers. |
| Experiment Setup | No | The paper does not contain specific experimental setup details such as hyperparameter values or training configurations. |
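
Although the paper contains no code, the setting it studies is concrete enough to sketch. Below is a minimal, hypothetical illustration of the deep linear network squared-error risk, $\ell(W_1,\ldots,W_L)=\tfrac{1}{2}\lVert W_L\cdots W_1 X - Y\rVert_F^2$, together with a numerical check of whether a given tuple of weights is a critical point (all partial gradients approximately zero). This is the object the paper's theorems reason about; it does not implement the paper's efficiently checkable global-optimality test itself. All matrix shapes, function names, and the tolerance `tol` are illustrative assumptions, not taken from the paper.

```python
# Sketch of the deep linear network risk and a first-order critical-point
# check. This only verifies stationarity; the paper's Theorems 2.1/2.2
# characterize which stationary points are global minima.
import numpy as np

def risk(weights, X, Y):
    """Squared-error risk 0.5 * ||W_L ... W_1 X - Y||_F^2."""
    P = X
    for W in weights:          # apply layers left to right: W_1 first
        P = W @ P
    return 0.5 * np.sum((P - Y) ** 2)

def gradients(weights, X, Y):
    """Closed-form gradient of the risk w.r.t. each W_i."""
    L = len(weights)
    # Forward products: acts[i] = W_i ... W_1 X, with acts[0] = X.
    acts = [X]
    for W in weights:
        acts.append(W @ acts[-1])
    R = acts[-1] - Y           # residual W_L ... W_1 X - Y
    grads = [None] * L
    back = R                   # backpropagated term (W_L ... W_{i+1})^T R
    for i in range(L - 1, -1, -1):
        # dl/dW_i = (W_L ... W_{i+1})^T R (W_{i-1} ... W_1 X)^T
        grads[i] = back @ acts[i].T
        back = weights[i].T @ back
    return grads

def is_critical_point(weights, X, Y, tol=1e-8):
    """True if every partial gradient is numerically zero."""
    return all(np.linalg.norm(G) <= tol for G in gradients(weights, X, Y))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 10))   # d_x = 4, m = 10 samples
    Y = rng.standard_normal((3, 10))   # d_y = 3
    weights = [rng.standard_normal((5, 4)),   # W_1
               rng.standard_normal((3, 5))]   # W_2
    print("risk:", risk(weights, X, Y))
    print("critical point:", is_critical_point(weights, X, Y))
```

A first-order check like this can only certify stationarity; the paper's contribution is that, for this nonconvex objective, deciding whether such a stationary point is a *global* minimum also reduces to an efficiently checkable condition, which would have to be transcribed from the theorem statements to be reproduced here.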