On Connected Sublevel Sets in Deep Learning
Authors: Quynh Nguyen
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This paper shows that every sublevel set of the loss function of a class of deep overparameterized neural nets with piecewise linear activation functions is connected and unbounded. This implies that the loss has no bad local valleys and that all of its global minima are connected within a single, potentially very large, global valley. |
| Researcher Affiliation | Academia | Department of Mathematics and Computer Science, Saarland University, Germany. Correspondence to: Quynh Nguyen <quynh@cs.uni-saarland.de>. |
| Pseudocode | No | The paper focuses on theoretical proofs and does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not mention the release of any source code. |
| Open Datasets | No | The paper is purely theoretical and does not mention any specific datasets used for training or experimentation, nor does it provide access information for any dataset. |
| Dataset Splits | No | The paper is purely theoretical and does not discuss dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running experiments. |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |
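For context, the central object in the paper's main result can be sketched in standard notation (a schematic restatement, not the paper's exact formulation):

```latex
Let $\Phi : \mathbb{R}^d \to \mathbb{R}$ denote the training loss as a function
of all network parameters $\theta$. For a level $\alpha \in \mathbb{R}$, the
sublevel set is
\[
  L_\alpha = \{\, \theta \in \mathbb{R}^d \;:\; \Phi(\theta) \le \alpha \,\}.
\]
The paper proves that, for the studied class of overparameterized networks with
piecewise linear activations, every $L_\alpha$ is connected and unbounded. In
particular, the set of global minima lies in a single connected component, so
the loss landscape contains no bad local valleys.
```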