Modeling Sparse Deviations for Compressed Sensing using Generative Models
Authors: Manik Dhar, Aditya Grover, Stefano Ermon
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Empirically, we observe consistent improvements in reconstruction accuracy over competing approaches, especially in the more practical setting of transfer compressed sensing where a generative model for a data-rich, source domain aids sensing on a data-scarce, target domain." See also Section 5, Experimental Evaluation. |
| Researcher Affiliation | Academia | 1Computer Science Department, Stanford University, CA, USA. |
| Pseudocode | No | The paper contains no pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We considered the MNIST dataset of handwritten digits (Le Cun et al., 2010) and the OMNIGLOT dataset of handwritten characters (Lake et al., 2015). |
| Dataset Splits | Yes | For VAE training, we used the standard train/held-out splits of both datasets. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions general software (e.g., TensorFlow is cited in the references) but does not provide version numbers for the software dependencies needed to replicate the experiments. |
| Experiment Setup | Yes | "The architecture and other hyperparameter details are given in the Appendix." |