Optimal approximation for unconstrained non-submodular minimization

Authors: Marwa El Halabi, Stefanie Jegelka

ICML 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We empirically validate our results on noisy submodular minimization and structured sparse learning. In particular, we address the following questions: (1) How robust are different submodular minimization algorithms, including PGM, to multiplicative noise? (2) How well can PGM minimize a non-submodular objective? Do the parameters (α, β) accurately characterize its performance?
Researcher Affiliation | Academia | Massachusetts Institute of Technology.
Pseudocode | No | The paper describes the algorithms and methods used (e.g., PGM), but it does not include a formally labeled 'Pseudocode' or 'Algorithm' block.
Open Source Code | Yes | All experiments were implemented in Matlab, and conducted on cluster nodes with 16 Intel Xeon E5 CPU cores and 64 GB RAM. Source code is available at https://github.com/marwash25/non-sub-min.
Open Datasets | Yes | We stopped each algorithm after 1000 iterations for the first dataset and after 400 iterations for the second one, or until the approximate duality gap reached 10^-8. To compute the optimal value H(S*), we use MNP with the noise-free oracle H. We refer the reader to (Bach, 2013, Sect. 12.1) for more details about the algorithms and datasets.
Dataset Splits | No | The paper discusses running algorithms for a certain number of iterations or until a duality gap is reached, and averaging results over multiple runs, but it does not specify explicit training, validation, or test dataset splits. For example, it says: 'We stopped each algorithm after 1000 iterations for the first dataset and after 400 iterations for the second one, or until the approximate duality gap reached 10^-8'.
Hardware Specification | Yes | All experiments were implemented in Matlab, and conducted on cluster nodes with 16 Intel Xeon E5 CPU cores and 64 GB RAM.
Software Dependencies | No | The paper states, 'All experiments were implemented in Matlab', but it does not provide a specific version number for Matlab or any other software libraries or dependencies used.
Experiment Setup | Yes | We stopped each algorithm after 1000 iterations for the first dataset and after 400 iterations for the second one, or until the approximate duality gap reached 10^-8. We set d = 250, k = 20 and vary the number of measurements n between d/4 and 2d. We compare the solutions obtained by minimizing the least squares loss ℓ(x) = ‖y − Ax‖₂² with the three regularizers: the range function F_r, where H is optimized via exhaustive search (OPT-Range), or via PGM (PGM-Range); ... λ was varied between 10^-4 and 10. Results are averaged over 5 runs.
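The experiment-setup row above can be sketched as a small script. This is not the authors' code (their implementation is in Matlab and uses the paper's set-function regularizers via PGM on the Lovász extension); it is a hedged Python sketch that reproduces the outer loop only — k-sparse ground truth, n Gaussian measurements varied between d/4 and 2d, a λ grid from 10^-4 to 10, averaging over 5 runs — with an ℓ1 penalty and ISTA as a simple stand-in for the paper's regularizers, and a step-size-based stopping test as a crude surrogate for the approximate duality gap. All function names here are hypothetical.

```python
import numpy as np

def ista_least_squares(A, y, lam, max_iter=400, tol=1e-8):
    """Proximal gradient (ISTA) on 0.5*||y - Ax||^2 + lam*||x||_1.

    The l1 penalty is a stand-in for the paper's set-function
    regularizers (e.g., the range function F_r), which would require
    a Lovász-extension oracle. Stops after max_iter iterations or
    when the iterate change drops below tol (a crude surrogate for
    the paper's approximate-duality-gap criterion).
    """
    d = A.shape[1]
    x = np.zeros(d)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth part
    for _ in range(max_iter):
        z = x - step * A.T @ (A @ x - y)           # gradient step
        x_new = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
        if np.linalg.norm(x_new - x) <= tol:
            break
        x = x_new
    return x

def run_trial(n, d=250, k=20, lams=np.logspace(-4, 1, 6), seed=0):
    """One run: k-sparse ground truth, n Gaussian measurements,
    sweep lambda over [1e-4, 10] and keep the best recovery error."""
    rng = np.random.default_rng(seed)
    x_true = np.zeros(d)
    support = rng.choice(d, size=k, replace=False)
    x_true[support] = rng.standard_normal(k)
    A = rng.standard_normal((n, d)) / np.sqrt(n)
    y = A @ x_true
    errs = [np.linalg.norm(ista_least_squares(A, y, lam) - x_true)
            for lam in lams]
    return min(errs)

# Usage (as in the paper: vary n between d/4 and 2d, average over 5 runs):
# avg_err = {n: np.mean([run_trial(n, seed=s) for s in range(5)])
#            for n in (62, 125, 250, 500)}
```

The λ grid and the min-over-λ selection mirror the paper's sweep of λ between 10^-4 and 10; the specific error metric and data model are assumptions for illustration.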