Continuous Submodular Maximization: Beyond DR-Submodularity

Authors: Moran Feldman, Amin Karbasi

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e-1)/(2e-1) - ε)-approximation guarantee while performing O(n/ε) iterations...
Researcher Affiliation | Academia | Moran Feldman, Department of Computer Science, University of Haifa, Haifa 3498838, Israel, moranfe@cs.haifa.ac.il; Amin Karbasi, School of Engineering and Applied Science, Yale University, New Haven, CT 06520, amin.karbasi@yale.edu
Pseudocode | Yes | Algorithm 1: COORDINATE-ASCENT(ε) (an illustrative coordinate-ascent sketch appears after this table)
Open Source Code | No | The paper does not provide any explicit statement about releasing source code for the described methodology, nor does it include a link to a code repository.
Open Datasets | No | The paper does not conduct empirical studies, use datasets, or describe any training process. Therefore, there is no information about public dataset availability.
Dataset Splits | No | The paper is theoretical and does not describe experiments or dataset usage, so there is no mention of training, validation, or test dataset splits.
Hardware Specification | No | The paper focuses on theoretical algorithm design and analysis, and therefore does not specify any hardware used for experiments.
Software Dependencies | No | The paper is theoretical and focuses on algorithm design and proofs, hence no software dependencies with specific version numbers are mentioned.
Experiment Setup | No | The paper is theoretical and describes algorithms and their approximation guarantees, but it does not detail any experimental setup, hyperparameters, or system-level training settings.
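The pseudocode entry above refers to Algorithm 1, COORDINATE-ASCENT(ε), a coordinate-ascent procedure for monotone continuous submodular maximization under a linear (budget) constraint. Since the paper gives only pseudocode and no code release, the Python sketch below is an illustrative assumption and not a reproduction of Algorithm 1: it greedily raises one coordinate at a time by the largest marginal gain per unit of budget until the budget B is spent. The objective F, the step size, and the stopping rule are placeholders. For scale, the quoted guarantee (e-1)/(2e-1) - ε evaluates to roughly 0.387 - ε.

import numpy as np

def coordinate_ascent(F, c, B, n, step=0.05):
    # Maximize a monotone function F over the box [0,1]^n subject to c . x <= B
    # by repeatedly increasing the single most cost-effective coordinate.
    x = np.zeros(n)
    spent = 0.0
    while spent < B - 1e-12:
        best_i, best_ratio, best_delta = None, 0.0, 0.0
        for i in range(n):
            # Largest feasible increase of coordinate i given the box and the remaining budget.
            delta = min(step, 1.0 - x[i], (B - spent) / c[i])
            if delta <= 1e-12:
                continue
            y = x.copy()
            y[i] += delta
            gain = F(y) - F(x)
            ratio = gain / (c[i] * delta)   # marginal gain per unit of budget
            if ratio > best_ratio:
                best_i, best_ratio, best_delta = i, ratio, delta
        if best_i is None:                  # no coordinate improves the objective
            break
        x[best_i] += best_delta
        spent += c[best_i] * best_delta
    return x

# Toy usage with a monotone, coordinate-wise concave objective (a hypothetical stand-in).
rng = np.random.default_rng(0)
n = 5
w = rng.uniform(1.0, 2.0, size=n)   # objective weights (hypothetical)
c = rng.uniform(0.5, 1.5, size=n)   # per-coordinate costs in the linear constraint
F = lambda x: float(np.sum(w * np.sqrt(x)))
x = coordinate_ascent(F, c, B=2.0, n=n)
print("solution:", np.round(x, 2), "cost:", round(float(c @ x), 2))

The "best marginal gain per unit cost" rule is a standard way to respect a knapsack-style linear constraint; the paper's actual algorithm and its approximation and iteration-count guarantees rest on an analysis that this sketch does not attempt to capture.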