Constant-Expansion Suffices for Compressed Sensing with Generative Priors
Authors: Constantinos Daskalakis, Dhruv Rohatgi, Emmanouil Zampetakis
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main contributions are mathematical in nature. We establish the notion of pseudo-Lipschitzness, along with a concentration inequality for random pseudo-Lipschitz functions and random matrices, and we use our results to further the theoretical understanding of the non-convex optimization landscape arising in compressed sensing with deep generative priors (a minimal sketch of this objective appears after the table). |
| Researcher Affiliation | Academia | Constantinos Daskalakis (MIT, costis@mit.edu); Dhruv Rohatgi (MIT, drohatgi@mit.edu); Manolis Zampetakis (MIT, mzampet@mit.edu) |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | The paper is theoretical and does not describe experiments that would involve using a specific publicly available dataset for training. |
| Dataset Splits | No | The paper is theoretical and does not provide specific dataset split information for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not provide specific software dependencies with version numbers needed to replicate experiments. |
| Experiment Setup | No | The paper is theoretical and does not provide specific experimental setup details such as hyperparameter values or training configurations. |
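
For context on the setting the paper analyzes, the sketch below illustrates the non-convex objective in compressed sensing with a generative prior: given measurements y = A G(z*), one attempts to recover the latent code by gradient descent on f(z) = ||A G(z) - y||^2. This is not the authors' algorithm or code; the one-layer ReLU generator, Gaussian measurement matrix, dimensions, and step size are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not from the paper): the non-convex objective arising in
# compressed sensing with a deep generative prior. A toy one-layer ReLU
# generator G maps a low-dimensional latent z to a signal; measurements are
# y = A @ G(z_star). Recovery is attempted by gradient descent on
# f(z) = 0.5 * ||A @ G(z) - y||^2. All dimensions and constants are illustrative.

rng = np.random.default_rng(0)
k, d, m = 5, 50, 25                           # latent dim, signal dim, measurements

W1 = rng.standard_normal((d, k))              # generator weights: G(z) = relu(W1 @ z)
A = rng.standard_normal((m, d)) / np.sqrt(m)  # Gaussian measurement matrix

def G(z):
    return np.maximum(W1 @ z, 0.0)

z_star = rng.standard_normal(k)               # unknown ground-truth latent code
y = A @ G(z_star)                             # noiseless measurements

def loss(z):
    r = A @ G(z) - y
    return 0.5 * r @ r

def grad(z):
    # (sub)gradient of f: W1^T (1{W1 z > 0} * A^T (A G(z) - y))
    pre = W1 @ z
    r = A @ G(z) - y
    return W1.T @ ((pre > 0).astype(float) * (A.T @ r))

z = rng.standard_normal(k)                    # random initialization
for _ in range(2000):
    z -= 0.05 * grad(z)

print(f"final loss: {loss(z):.3e}")
print(f"signal error ||G(z) - G(z*)||: {np.linalg.norm(G(z) - G(z_star)):.3e}")
```

The paper's theoretical results concern the landscape of objectives of this form (e.g., absence of spurious stationary points under expansion conditions on the generator), rather than any particular descent procedure such as the one sketched here.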