Fully Dynamic Submodular Maximization over Matroids
Authors: Paul Duetting, Federico Fusco, Silvio Lattanzi, Ashkan Norouzi-Fard, Morteza Zadimoghaddam
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main result is a randomized algorithm that maintains an efficient data structure with an O(k²) amortized update time (in the number of additions and deletions) and yields a 4-approximate solution, where k is the rank of the matroid. |
| Researcher Affiliation | Collaboration | ¹Google Research; ²Department of Computer, Control and Management Engineering Antonio Ruberti, Sapienza University of Rome, Rome, Italy. |
| Pseudocode | Yes | Algorithm 1 SWAPPING; Algorithm 2 INITIALIZATION; Algorithm 3 INSERTION(e); Algorithm 4 DELETION(e); Algorithm 5 LEVEL-CONSTRUCT(ℓ); Algorithm 6 THRESHOLD-SWAPPING; Algorithm 7 INSERTION(e); Algorithm 8 LEVEL-CONSTRUCT(ℓ) (an illustrative sketch of the swapping idea follows the table) |
| Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the methodology described. |
| Open Datasets | No | The paper is theoretical, focusing on algorithm design and analysis rather than empirical evaluation; no publicly available datasets are mentioned. |
| Dataset Splits | No | As the paper contains no empirical experiments on datasets, no training, validation, or test splits are mentioned. |
| Hardware Specification | No | No hardware specifications for running experiments are mentioned; the paper reports no empirical evaluation. |
| Software Dependencies | No | No software dependencies with version numbers are mentioned; the paper reports no empirical evaluation. |
| Experiment Setup | No | No experimental setup details, such as hyperparameters or system-level training settings, are provided; the paper reports no empirical evaluation. |
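
The paper's SWAPPING routine (Algorithm 1) appears only as pseudocode, and its dynamic machinery (levels, thresholds, amortization) is considerably more involved than any short snippet can show. As a reading aid only, here is a minimal Python sketch of the classical swapping rule for monotone submodular maximization under a matroid constraint, the kind of rule the paper builds on. The names `swapping_insert`, `f`, and `is_independent`, and the factor-2 swap threshold, are illustrative assumptions, not the paper's exact interface or guarantee.

```python
# Illustrative sketch only (not the paper's Algorithm 1): a classical
# swapping rule for monotone submodular maximization under a matroid
# constraint. Each kept element remembers the marginal gain ("weight")
# it had when it entered the solution.

def swapping_insert(S, weights, e, f, is_independent):
    """Try to add element e to the current solution S (a set).

    weights        : dict mapping each element of S to its entry gain
    f              : monotone submodular set function, f(set) -> float
    is_independent : matroid oracle, is_independent(set) -> bool
    """
    gain = f(S | {e}) - f(S)          # marginal gain of e w.r.t. S
    if is_independent(S | {e}):       # matroid has room: just add e
        S.add(e)
        weights[e] = gain
        return
    # Otherwise look for a swap: an element s whose removal restores
    # independence and whose stored entry gain is smallest.
    candidates = [s for s in S if is_independent((S - {s}) | {e})]
    if candidates:
        s = min(candidates, key=weights.get)
        if gain >= 2 * weights[s]:    # factor-2 threshold (assumed here)
            S.discard(s); del weights[s]
            S.add(e); weights[e] = gain
```

A toy run, with a coverage function and a rank-2 uniform matroid standing in for the general matroid oracle:

```python
# Toy run: maximum coverage under a rank-2 uniform matroid.
sets = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 4, 5}}
f = lambda S: len(set().union(*(sets[x] for x in S))) if S else 0
is_independent = lambda S: len(S) <= 2

S, weights = set(), {}
for e in "abcd":
    swapping_insert(S, weights, e, f, is_independent)
print(sorted(S), f(S))   # ['a', 'd'] 4 -- "b" was swapped out by "d"
```

The paper's full algorithm layers a threshold-based variant of this rule (THRESHOLD-SWAPPING) and a leveled reconstruction scheme (LEVEL-CONSTRUCT) on top of the basic swap step to also handle deletions with O(k²) amortized update time; none of that dynamic machinery is captured by this sketch.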