Hardness and Algorithms for Robust and Sparse Optimization
Authors: Eric Price, Sandeep Silwal, Samson Zhou
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We explore algorithms and limitations for sparse optimization problems such as sparse linear regression and robust linear regression. |
| Researcher Affiliation | Academia | 1Department of Electrical and Computer Engineering, The University of Texas at Austin. 2Electrical Engineering and Computer Science Department, Massachusetts Institute of Technology. 3Computer Science Department, Carnegie Mellon University. |
| Pseudocode | Yes | Algorithm 1 Sparse Regression Upper Bound |
| Open Source Code | No | The paper contains no statement about releasing open-source code and provides no link to a code repository. |
| Open Datasets | No | The paper is theoretical, focusing on algorithms and hardness proofs; it describes no experiments on datasets and provides no dataset access information. |
| Dataset Splits | No | The paper is theoretical and does not describe any dataset splits (training, validation, or test). |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and lists no software dependencies or version numbers needed for reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters or training configurations. |