Learning Compressed Transforms with Low Displacement Rank
Authors: Anna Thomas, Albert Gu, Tri Dao, Atri Rudra, Christopher Ré
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | From Section 5 (Empirical evaluation): In Section 5.1 we consider a standard setting of compressing a single hidden layer (SHL) neural network and the fully-connected (FC) layer of a CNN for image classification tasks. Following previous work [7, 45], we test on two challenging MNIST variants [30], and include two additional datasets with more realistic objects (CIFAR-10 [29] and NORB [32]). |
| Researcher Affiliation | Academia | Department of Computer Science, Stanford University; Department of Computer Science and Engineering, University at Buffalo, SUNY |
| Pseudocode | No | The paper describes algorithms and refers to Appendix C for more complete descriptions, but does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/HazyResearch/structured-nets. |
| Open Datasets | Yes | we test on two challenging MNIST variants [30], and include two additional datasets with more realistic objects (CIFAR-10 [29] and NORB [32]). |
| Dataset Splits | No | The paper mentions training and test data and refers to a 'standard split' for some datasets, but does not provide explicit percentages or counts for the training, validation, and test splits used in its experiments. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions 'implemented in PyTorch' but does not specify version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | No | The paper states 'Appendix F includes more experimental details and protocols', indicating that specific setup details like hyperparameters are not present in the main text. |
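As context for the "Open Source Code" and "Software Dependencies" rows above, the sketch below illustrates the kind of layer the paper's PyTorch implementation learns: a square linear map whose weight is Toeplitz-like with low displacement rank, parameterized only by rank-r generators. This is a minimal, unoptimized sketch under assumptions, not the authors' code: the class name `ToeplitzLikeLinear`, the `krylov` helper, and the dense reconstruction of the weight are illustrative choices, and the released repository (https://github.com/HazyResearch/structured-nets) uses fast structured multiplication rather than materializing the full matrix.

```python
import torch
import torch.nn as nn


def krylov(A, v):
    """Return the n x n Krylov matrix [v, Av, A^2 v, ..., A^(n-1) v]."""
    cols = [v]
    for _ in range(v.shape[0] - 1):
        cols.append(A @ cols[-1])
    return torch.stack(cols, dim=1)


class ToeplitzLikeLinear(nn.Module):
    """Linear layer with a Toeplitz-like weight of low displacement rank.

    The n x n weight is rebuilt from 2*r*n learned generator entries as
    W = sum_i K(Z, g_i) @ K(Z, h_i)^T, where Z is the lower shift matrix,
    so the parameter count scales as O(rn) instead of O(n^2).
    """

    def __init__(self, n, r):
        super().__init__()
        self.G = nn.Parameter(torch.randn(n, r) / n ** 0.5)  # generators g_i
        self.H = nn.Parameter(torch.randn(n, r) / n ** 0.5)  # generators h_i
        # Lower shift matrix: ones on the first subdiagonal.
        self.register_buffer("Z", torch.diag(torch.ones(n - 1), diagonal=-1))

    def forward(self, x):
        # Dense O(r n^2) reconstruction, kept simple for illustration.
        W = sum(
            krylov(self.Z, self.G[:, i]) @ krylov(self.Z, self.H[:, i]).T
            for i in range(self.G.shape[1])
        )
        return x @ W.T


# Example: a compressed stand-in for a 784 x 784 FC layer (MNIST-sized input).
layer = ToeplitzLikeLinear(n=784, r=4)
y = layer(torch.randn(32, 784))  # -> shape (32, 784)
```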