On the Power of Compressed Sensing with Generative Models
Authors: Akshay Kamath, Eric Price, Sushrut Karmalkar
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present two results establishing the difficulty and strength of this latter task, showing that existing bounds are tight: First, we provide a lower bound matching the (Bora et al., 2017) upper bound for compressed sensing with L-Lipschitz generative models G which holds even for the more relaxed goal of non-uniform recovery. Second, we show that generative models generalize sparsity as a representation of structure by constructing a ReLU-based neural network with 2 hidden layers and O(n) activations per layer whose range is precisely the set of all k-sparse vectors. (A toy ReLU sketch of this kind of construction appears below the table.) |
| Researcher Affiliation | Academia | Department of Computer Science, The University of Texas, Austin, Texas. Correspondence to: Akshay Kamath <kamath@cs.utexas.edu>, Sushrut Karmalkar <sushrutk@cs.utexas.edu>, Eric Price <ecprice@cs.utexas.edu>. |
| Pseudocode | No | The paper describes mathematical constructions and proof techniques but does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access to source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not use or reference any publicly available datasets for training or empirical evaluation. |
| Dataset Splits | No | The paper is theoretical and does not discuss dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not mention any hardware specifications used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers for experimental reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not provide details about an experimental setup, hyperparameters, or training configurations. |
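
The claim in the Research Type row — that a two-hidden-layer ReLU network can have the set of k-sparse vectors as its range — can be illustrated with a small gadget. The sketch below is not the paper's construction (which covers unbounded entries within the stated O(n) activation budget); it is a minimal numpy illustration, under the extra assumptions that the position input `s` is an integer in {0, ..., n-1} and the value input `v` satisfies |v| <= B for a fixed constant `B`, of how two ReLU layers can pin an output to a single coordinate:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def one_sparse_generator(s, v, n, B=10.0):
    """Two-hidden-layer ReLU gadget: maps (s, v) to the vector in R^n
    holding value v at integer position s and zeros elsewhere.

    Assumes s is an integer in {0, ..., n-1} and |v| <= B; the paper's
    construction needs neither restriction.
    """
    i = np.arange(n)
    # Hidden layer 1 (2n ReLUs): d_i = |s - i| = ReLU(s - i) + ReLU(i - s).
    d = relu(s - i) + relu(i - s)
    # Hidden layer 2 (2n ReLUs): out_i = ReLU(v - B*d_i) - ReLU(-v - B*d_i).
    # At d_i = 0 this equals ReLU(v) - ReLU(-v) = v; at d_i >= 1 both
    # terms clamp to zero because |v| <= B.
    return relu(v - B * d) - relu(-v - B * d)

print(one_sparse_generator(s=3, v=-2.5, n=8))
# [ 0.   0.   0.  -2.5  0.   0.   0.   0. ]
```

Summing k independent copies of this gadget (one per nonzero) yields vectors with at most k nonzeros, which is the sense in which a generative model's range can generalize sparsity as a representation of structure; the paper's actual construction achieves the exact, unrestricted k-sparse range.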