Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Sparse and Smooth Signal Estimation: Convexification of ℓ0-Formulations
Authors: Alper Atamtürk, Andrés Gómez, Shaoning Han
JMLR 2021 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we present experiments utilizing the strong convex relaxations based on the pairwise convexification methods proposed in the paper. In Section 5.1, we perform experiments to evaluate whether the convex model decomp provides a good approximation to the non-convex problem (11). In Section 5.2 we test the merits of formulation decomp (with a variety of constraints C) compared to the usual ℓ1-approximation from an inference perspective. Finally, in Section 5.3 we test the Lagrangian relaxation-based method proposed in Section 4.3. |
| Researcher Affiliation | Academia | Alper Atamtürk EMAIL Department of Industrial Engineering & Operations Research, University of California, Berkeley, CA 94720, USA; Andrés Gómez EMAIL Daniel J. Epstein Department of Industrial & Systems Engineering, University of Southern California, Los Angeles, CA 90089, USA; Shaoning Han EMAIL Daniel J. Epstein Department of Industrial & Systems Engineering, University of Southern California, Los Angeles, CA 90089, USA |
| Pseudocode | Yes | Algorithm 1 Algorithm to solve formulation decomp |
| Open Source Code | Yes | All data and code used in the computations are available at https://sites.google.com/usc.edu/gomez/data. |
| Open Datasets | Yes | Consider the accelerometer data depicted in Figure 2 (A), used in Casale et al. (2011, 2012) and downloaded from the UCI Machine Learning Repository Dheeru and Karra Taniskidou (2017). |
| Dataset Splits | No | For each parameter combination, two signals are randomly generated: one signal for training, the other for testing. |
| Hardware Specification | Yes | All computations are performed on a laptop with eight Intel(R) Core(TM) i7-8550 CPUs and 16GB RAM. All data and code used in the computations are available at https://sites.google.com/usc.edu/gomez/data. |
| Software Dependencies | Yes | We use Mosek 8.1.0 (with default settings) to solve the conic quadratic optimization problems. ... Specifically, we use the perspective reformulation of (41), i.e., ... and solve the problems using Gurobi 8.0 with a one hour time limit. |
| Experiment Setup | Yes | We test the convex formulations with the accelerometer data using λ = 0.1t and k = 500t for t = 1, . . . , 10 for all 100 combinations. ... We consider two criteria for choosing a pair (λ, µ): ... The algorithm is terminated when the relative improvement of the relaxation (ζnew − ζold)/ζnew ≤ 5 × 10^−5. ... We terminate the algorithm when ξ(γh) < ϵ (ϵ = 10^−3 in our computations) or when the number of iterations reaches hmax (hmax = 100 in our computations). |
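The extracted experiment setup can be restated concretely. The sketch below, in Python, reconstructs the (λ, k) parameter grid and the two stopping rules quoted above; it is an illustrative reading of the quoted text, not the authors' released code, and all function and variable names (`relaxation_converged`, `lagrangian_done`, etc.) are hypothetical.

```python
# Parameter grid from the quote: λ = 0.1t and k = 500t for t = 1, ..., 10,
# giving 10 × 10 = 100 (λ, k) combinations.
lambdas = [0.1 * t for t in range(1, 11)]
ks = [500 * t for t in range(1, 11)]
grid = [(lam, k) for lam in lambdas for k in ks]
assert len(grid) == 100

def relaxation_converged(zeta_new, zeta_old, tol=5e-5):
    """Stop the relaxation loop when the relative improvement
    (ζnew − ζold)/ζnew drops to 5e-5 or below."""
    return (zeta_new - zeta_old) / zeta_new <= tol

def lagrangian_done(xi_gamma, h, eps=1e-3, h_max=100):
    """Stop the Lagrangian method when ξ(γh) < ε (ε = 1e-3)
    or the iteration count h reaches hmax = 100."""
    return xi_gamma < eps or h >= h_max
```

Note that the relaxation criterion is relative (scaled by ζnew), so it is insensitive to the magnitude of the objective, while the Lagrangian criterion is an absolute tolerance on the gap measure ξ(γh).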