Benefits of Additive Noise in Composing Classes with Bounded Capacity

Authors: Alireza Fathollah Pour, Hassan Ashtiani

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Preliminary empirical results on the MNIST dataset indicate that the amount of noise required to improve over existing uniform bounds can be numerically negligible (i.e., element-wise i.i.d. Gaussian noise with standard deviation 10^{-240}) (see the note below the table).
Researcher Affiliation | Academia | Alireza Fathollah Pour, Department of Computing and Software, McMaster University, fathola@mcmaster.ca; Hassan Ashtiani, Department of Computing and Software, McMaster University, zokaeiam@mcmaster.ca
Pseudocode | No | The paper does not contain explicit pseudocode or algorithm blocks. It describes the methods and theoretical derivations in prose and mathematical notation.
Open Source Code | Yes | The source codes are available at https://github.com/fathollahpour/composition_noise
Open Datasets | Yes | We train fully connected neural networks on the MNIST dataset.
Dataset Splits | No | The paper mentions using train and test data for the MNIST dataset but does not specify a validation split or its size/percentage.
Hardware Specification | Yes | The experiments were run on a single NVIDIA 2080 GPU.
Software Dependencies | No | Appendix I mentions 'Python and PyTorch' but does not specify version numbers for these dependencies.
Experiment Setup | Yes | We use SGD with a learning rate of 0.01 and momentum 0.9. We train for 50 epochs. (See the sketch below the table.)
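
A note on the reported noise level: 10^{-240} is far below the smallest positive float32 value (roughly 1.4 × 10^{-45}), so in single precision such noise literally underflows to zero, while float64 can still represent it. The snippet below is a quick numerical check of this point, not code from the paper.

```python
import torch

# 1e-240 underflows to 0.0 in float32 but is representable in float64.
print(torch.tensor(1e-240, dtype=torch.float32))  # tensor(0.)
print(torch.tensor(1e-240, dtype=torch.float64))  # tensor(1.0000e-240, dtype=torch.float64)
```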
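
To make the reported setup concrete, here is a minimal PyTorch sketch of a fully connected MNIST classifier with element-wise i.i.d. Gaussian noise injected between layers, trained with SGD (learning rate 0.01, momentum 0.9) for 50 epochs, as stated in the table. The hidden width, batch size, noise placement, and the illustrative sigma are assumptions for this sketch, not the authors' choices; their actual implementation is in the repository linked above.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class GaussianNoise(nn.Module):
    """Element-wise i.i.d. Gaussian noise, applied only during training."""
    def __init__(self, sigma: float):
        super().__init__()
        self.sigma = sigma

    def forward(self, x):
        if self.training and self.sigma > 0:
            x = x + self.sigma * torch.randn_like(x)
        return x

def make_noisy_mlp(sigma: float = 1e-38) -> nn.Sequential:
    # Hidden width (500) and a single noise layer are assumptions;
    # sigma = 1e-38 stands in for "numerically negligible" noise
    # (10**-240 itself underflows to 0.0 in float32; see the note above).
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 500),
        nn.ReLU(),
        GaussianNoise(sigma),  # noise injected between layers
        nn.Linear(500, 10),
    )

def train() -> None:
    device = "cuda" if torch.cuda.is_available() else "cpu"  # paper: one NVIDIA 2080
    model = make_noisy_mlp().to(device)
    # Hyperparameters as reported: SGD, lr 0.01, momentum 0.9, 50 epochs.
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    train_set = datasets.MNIST("data", train=True, download=True,
                               transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=128, shuffle=True)  # batch size assumed
    for _ in range(50):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

if __name__ == "__main__":
    train()
```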