Generalization bounds for deep convolutional neural networks
Authors: Philip M. Long, Hanie Sedghi
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We present experiments using CIFAR-10 with varying hyperparameters of a deep convolutional network, comparing our bounds with practical generalization gaps. |
| Researcher Affiliation | Industry | Philip M. Long and Hanie Sedghi Google Brain {plong,hsedghi}@google.com |
| Pseudocode | No | The paper contains mathematical derivations and proofs but no pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statements or links indicating that open-source code for the methodology is available. |
| Open Datasets | Yes | We present experiments using CIFAR-10 with varying hyperparameters of a deep convolutional network, comparing our bounds with practical generalization gaps. |
| Dataset Splits | No | The paper mentions training and testing data but does not explicitly describe a separate validation set split or its parameters. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software dependencies or libraries used. |
| Experiment Setup | Yes | The network was trained with dropout regularization and an exponential learning rate schedule. We define the generalization gap as the difference between train error and test error. In order to analyze the effect of the number of network parameters on the generalization gap, we scaled up the number of channels in each layer, while keeping other elements of the architecture, including the depth, fixed. Each network was trained repeatedly, sweeping over different values of the initial learning rate and batch sizes 32, 64, 128. For each setting the results were averaged over five different random initializations. |
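
The sweep described in the Experiment Setup row (scaling channel counts, sweeping initial learning rates and batch sizes 32/64/128, and averaging the generalization gap over five random initializations) can be sketched as follows. This is a minimal illustration, not the authors' code: `train_and_eval` is a hypothetical placeholder for the actual CIFAR-10 training run, and the width and learning-rate values are assumed examples, since the paper's table does not list them.

```python
# Hedged sketch of the hyperparameter sweep from the Experiment Setup row.
# `train_and_eval` is a hypothetical stand-in for a full CIFAR-10 training
# run; it returns deterministic toy errors so the sketch runs end to end.
from itertools import product
from statistics import mean

def train_and_eval(width, lr, batch_size, seed):
    """Placeholder: returns (train_error, test_error) for one run."""
    train_err = 0.01 + 0.001 * seed
    # Toy trend: wider networks and larger batches shift the test error.
    test_err = train_err + 0.05 / (width / 64) + 0.001 * batch_size / 128
    return train_err, test_err

widths = [64, 128, 256]        # channels per layer, scaled up (assumed values)
learning_rates = [0.01, 0.1]   # initial learning rates (assumed values)
batch_sizes = [32, 64, 128]    # as stated in the paper
num_seeds = 5                  # five random initializations per setting

results = {}
for width, lr, bs in product(widths, learning_rates, batch_sizes):
    gaps = []
    for seed in range(num_seeds):
        train_err, test_err = train_and_eval(width, lr, bs, seed)
        gaps.append(test_err - train_err)  # generalization gap per run
    results[(width, lr, bs)] = mean(gaps)  # average over the five seeds
```

Each entry of `results` holds the seed-averaged generalization gap for one (width, learning rate, batch size) configuration, mirroring how the paper compares its bounds against measured gaps across settings.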