Parallel Submodular Function Minimization

Authors: Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We consider the parallel complexity of submodular function minimization (SFM). We provide a pair of methods which obtain two new query-versus-depth trade-offs for minimizing a submodular function defined on subsets of n elements that has integer values between −M and M. The first method has depth 2 and query complexity n^{O(M)}, and the second method has depth Õ(n^{1/3} M^{2/3}) and query complexity O(poly(n, M)). (An illustrative sketch of the SFM problem setting appears after this table.)
Researcher Affiliation | Collaboration | Deeparnab Chakrabarty (Dartmouth College, Hanover, USA; deeparnab@dartmouth.edu); Andrei Graur (Stanford University, Stanford, USA; agraur@stanford.edu); Haotian Jiang (Microsoft Research, Redmond, USA; jhtdavid96@gmail.com); Aaron Sidford (Stanford University, Stanford, USA; sidford@stanford.edu)
Pseudocode | Yes | Algorithm 1: Augmenting Sets Algorithm
Open Source Code | No | The paper is theoretical and focuses on algorithm design and complexity analysis. It does not mention providing open-source code for the described methodology.
Open Datasets | No | The paper is theoretical and does not conduct experiments on datasets, thus no information about publicly available or open datasets is provided.
Dataset Splits | No | The paper is theoretical and does not conduct experiments on datasets, thus no information about training/validation/test dataset splits is provided.
Hardware Specification | No | The paper is theoretical and does not describe experimental setups or hardware used for computation.
Software Dependencies | No | The paper is theoretical and does not describe experimental implementations or software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not conduct experiments, therefore no experimental setup details like hyperparameters or training configurations are provided.
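
For context on the problem the abstract describes, below is a minimal, hypothetical Python sketch of SFM given only an evaluation oracle. It is not the paper's algorithm: it performs exhaustive search, which issues all 2^n queries non-adaptively (depth 1) at exponential query cost, illustrating one extreme of the query-versus-depth trade-off the paper studies. The names brute_force_sfm and example_f are illustrative assumptions, not from the paper.

```python
import itertools

def brute_force_sfm(f, n):
    """Minimize a set function f over all subsets of {0, ..., n-1}.

    f is an evaluation oracle mapping a frozenset to an integer.
    All 2^n oracle queries are independent of one another, so they
    could be issued in a single parallel round (depth 1) -- but the
    query count is exponential, the extreme point of the
    query-versus-depth trade-off studied in the paper.
    """
    best_set, best_val = frozenset(), f(frozenset())
    for r in range(1, n + 1):
        for combo in itertools.combinations(range(n), r):
            s = frozenset(combo)
            v = f(s)
            if v < best_val:
                best_set, best_val = s, v
    return best_set, best_val

# Hypothetical example: f(S) = |S| * (n - |S|) + [0 in S], a cut-style
# integer-valued submodular function on n = 4 elements (a concave
# function of |S| plus a modular term is submodular).
def example_f(s, n=4):
    return len(s) * (n - len(s)) + (1 if 0 in s else 0)

if __name__ == "__main__":
    minimizer, value = brute_force_sfm(example_f, 4)
    print(sorted(minimizer), value)  # expected: [] 0
```

The paper's contribution lies between the extremes this sketch suggests: its methods obtain sub-exponential query counts at low depth, whereas the sketch above only fixes the interface (subsets in, integer values out) that such results assume.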