Revisiting Spatial Invariance with Low-Rank Local Connectivity
Authors: Gamaleldin Elsayed, Prajit Ramachandran, Jonathon Shlens, Simon Kornblith
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We performed classification experiments on MNIST, CIFAR-10, and CelebA datasets. We trained our models without data augmentation or regularization to focus our investigation on the pure effects of the degree of spatial invariance on generalization. |
| Researcher Affiliation | Industry | 1Google Research, Brain Team. |
| Pseudocode | Yes | Algorithm 1 Low Rank Locally Connected Layer (a minimal sketch of this layer appears after the table). |
| Open Source Code | Yes | Code is available at github.com/google-research/google-research/tree/master/low_rank_local_connectivity. |
| Open Datasets | Yes | We performed classification experiments on MNIST, CIFAR-10, and CelebA datasets. |
| Dataset Splits | Yes | "Our division of training, validation and test subsets are shown in Table Supp.1." and "The optimal rank in LRLC is obtained by evaluating models on a separate validation subset." |
| Hardware Specification | Yes | We used Tensor Processing Unit (TPU) accelerators in all our training. |
| Software Dependencies | No | The paper mentions the Adam optimizer and batch normalization but does not give version numbers for any software dependencies, such as deep learning frameworks (e.g., TensorFlow, PyTorch) or other libraries. |
| Experiment Setup | Yes | In our experiments, we used the Adam optimizer with a maximum learning rate of 0.01 and a minibatch size of 512. We trained our models for 150 epochs starting with a linear warmup period of 10 epochs and used a cosine decay schedule afterwards. (A sketch of this schedule follows below.) |
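
To make the quoted pseudocode concrete, here is a minimal NumPy sketch of a low-rank locally connected (LRLC) layer. It follows the paper's efficient formulation: apply K shared basis kernels as ordinary convolutions, then mix the K outputs at each spatial position with per-position combining weights. The single-channel input, the row-plus-column parameterization of the combining weights, and all names (`lrlc_layer`, `basis_kernels`, `row_weights`, `col_weights`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lrlc_layer(x, basis_kernels, row_weights, col_weights):
    """Low-rank locally connected layer, minimal single-channel sketch.

    x:             (H, W) input feature map
    basis_kernels: (K, kh, kw) rank-K set of shared kernels (odd kh, kw)
    row_weights:   (K, H) per-row combining weights (assumed parameterization)
    col_weights:   (K, W) per-column combining weights
    """
    K, kh, kw = basis_kernels.shape
    H, W = x.shape
    pad_h, pad_w = kh // 2, kw // 2
    xp = np.pad(x, ((pad_h, pad_h), (pad_w, pad_w)))

    # Step 1: apply each basis kernel as an ordinary "same" convolution
    # (cross-correlation, which suffices for a sketch).
    conv_out = np.zeros((K, H, W))
    for k in range(K):
        for i in range(H):
            for j in range(W):
                conv_out[k, i, j] = np.sum(xp[i:i + kh, j:j + kw] * basis_kernels[k])

    # Step 2: combining weight logits per position = row term + column term,
    # softmax-normalized across the K basis kernels.
    logits = row_weights[:, :, None] + col_weights[:, None, :]  # (K, H, W)
    logits -= logits.max(axis=0, keepdims=True)
    alpha = np.exp(logits)
    alpha /= alpha.sum(axis=0, keepdims=True)

    # Step 3: per-position mixture of the K convolution outputs.
    return np.sum(alpha * conv_out, axis=0)                      # (H, W)
```

With rank K = 1 this reduces to an ordinary convolution (full spatial invariance), while giving every position its own kernel would recover a fully locally connected layer; the LRLC layer interpolates between these two extremes, which is the spectrum the paper studies.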
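
The reported training schedule (linear warmup for 10 epochs into a cosine decay over 150 epochs, peaking at 0.01) can be sketched as below. Whether the original code steps the rate per epoch or per minibatch, and whether the decay reaches exactly zero, are assumptions here.

```python
import math

def learning_rate(epoch, max_lr=0.01, warmup_epochs=10, total_epochs=150):
    """Linear warmup followed by cosine decay (sketch of the reported schedule)."""
    if epoch < warmup_epochs:
        # Ramp linearly from max_lr / warmup_epochs up to max_lr.
        return max_lr * (epoch + 1) / warmup_epochs
    # Cosine decay from max_lr down to 0 over the remaining epochs.
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return max_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```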