Multivariate Submodular Optimization

Authors: Richard Santiago, F. Bruce Shepherd

ICML 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our focus is on a more general class of multivariate submodular optimization (MVSO) problems: min / max { f(S1, S2, . . . , Sk) : S1 ⊎ S2 ⊎ · · · ⊎ Sk ∈ F }. For maximization, we show that practical algorithms such as accelerated greedy variants and distributed algorithms achieve good approximation guarantees for very general families (such as matroids and p-systems). For arbitrary families, we show that monotone (resp. nonmonotone) MVSO admits an α(1 − 1/e) (resp. α · 0.385) approximation whenever monotone (resp. nonmonotone) SO admits an α-approximation over the multilinear formulation. This substantially expands the family of tractable models. On the minimization side we give essentially optimal approximations in terms of the curvature of f.
Researcher Affiliation | Academia | School of Computer Science, McGill University, Montreal, Canada; Department of Computer Science, University of British Columbia, Vancouver, Canada. Correspondence to: Richard Santiago <richard.santiagotorres@mail.mcgill.ca>.
Pseudocode | No | The paper is theoretical and describes its algorithms conceptually, but does not include structured pseudocode blocks or figures labeled "Algorithm".
Open Source Code | No | The paper does not contain any statement about releasing source code, nor a link to a code repository for the described methodology.
Open Datasets | No | The paper is theoretical and does not conduct experiments involving datasets, training, or public dataset access.
Dataset Splits | No | The paper is theoretical and does not conduct experiments, so no dataset split information (training, validation, test) is provided.
Hardware Specification | No | The paper is theoretical and does not report on experiments, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and does not report on experiments or list software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup, hyperparameters, or training configurations.
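The greedy guarantees summarized in the Research Type row concern maximizing a monotone multivariate submodular function f(S1, . . . , Sk) over disjoint parts. As a minimal illustrative sketch only (not the paper's accelerated or distributed variants, whose details are not reproduced here), the following Python greedily assigns (element, part) pairs by largest marginal gain under a simple cardinality budget; the function name `greedy_mvso` and the example objective are our own assumptions, not from the paper:

```python
from itertools import product

def greedy_mvso(ground_set, f, k, budget):
    """Illustrative greedy for monotone MVSO maximization.

    f takes a list of k disjoint sets [S1, ..., Sk] and returns a float.
    Repeatedly places the (element, part) pair with the largest marginal
    gain until `budget` elements have been assigned.
    """
    parts = [set() for _ in range(k)]
    remaining = set(ground_set)
    for _ in range(min(budget, len(remaining))):
        base = f(parts)
        best, best_gain = None, 0.0
        for e, i in product(remaining, range(k)):
            # Evaluate the marginal gain of placing e into part i.
            parts[i].add(e)
            gain = f(parts) - base
            parts[i].remove(e)
            if best is None or gain > best_gain:
                best, best_gain = (e, i), gain
        e, i = best
        parts[i].add(e)
        remaining.remove(e)
    return parts

# Example objective: f(S1, ..., Sk) = sum_i sqrt(|Si|), a monotone
# function with diminishing returns; the greedy spreads elements
# across parts because each part's marginal gain shrinks as it grows.
parts = greedy_mvso(range(4), lambda ps: sum(len(p) ** 0.5 for p in ps),
                    k=2, budget=4)
```

With this toy objective the greedy places all four elements and balances the two parts, since the marginal gain of a smaller part always dominates.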