Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Fast and Private Submodular and $k$-Submodular Functions Maximization with Matroid Constraints
Authors: Akbar Rafiey, Yuichi Yoshida
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy. We provide a (1 - 1/e)-approximation algorithm, which improves upon the previous results in terms of approximation guarantee. This is done with an almost cubic number of function evaluations. Moreover, we study k-submodularity, a natural generalization of submodularity. We give the first 1/2-approximation algorithm that preserves differential privacy for maximizing monotone k-submodular functions subject to matroid constraints. The approximation ratio is asymptotically tight and is obtained with an almost linear number of function evaluations. |
| Researcher Affiliation | Academia | ¹Department of Computing Science, Simon Fraser University, Burnaby, Canada; ²National Institute of Informatics, Tokyo, Japan. Correspondence to: Akbar Rafiey <arafiey@sfu.ca>, Yuichi Yoshida <EMAIL>. |
| Pseudocode | Yes | Algorithm 1 Differentially Private Continuous Greedy; Algorithm 2 Improved Differentially Private Continuous Greedy Algorithm; Algorithm 3 Differentially private k-submodular maximization with a matroid constraint; Algorithm 4 Improved differentially private k-submodular maximization with a matroid constraint |
| Open Source Code | No | The paper does not provide any statement about releasing source code or a link to a code repository. |
| Open Datasets | No | The paper refers to a 'sensitive dataset D' as an input to the functions being maximized, but it does not specify or provide access information for any concrete, publicly available dataset used for training or evaluation. Examples such as 'medical data, web search query data, salary data, social networks' are general types of data, not specific datasets with access details. |
| Dataset Splits | No | The paper is theoretical and does not conduct empirical experiments; it therefore provides no information about train/validation/test splits. |
| Hardware Specification | No | The paper does not mention any specific hardware used for running experiments. It focuses on theoretical algorithms and their complexity. |
| Software Dependencies | No | The paper does not provide specific software names with version numbers (e.g., programming languages, libraries, or solvers) required to replicate the experiments or implement the algorithms. |
| Experiment Setup | No | The paper describes algorithms theoretically and does not include an experimental setup section with concrete hyperparameters, training configurations, or system-level settings. |
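The paper itself is purely theoretical, so no reference implementation exists. As an illustrative aid only, the sketch below shows the general flavor of differentially private greedy selection for a monotone submodular objective (max coverage under a cardinality constraint, a special case of a matroid constraint), where each greedy pick is randomized via the exponential mechanism. This is *not* the paper's continuous-greedy algorithm: the function name, the per-step privacy split, and the unit-sensitivity assumption are all choices made for this sketch, and the composition/sensitivity analysis needed for a real privacy guarantee is omitted.

```python
import math
import random

def dp_greedy_max_cover(candidate_sets, k, epsilon, sensitivity=1.0, seed=0):
    """Pick k sets greedily, maximizing coverage (monotone submodular).

    Each of the k picks uses the exponential mechanism:
        P(choose i) ∝ exp(eps_step * gain_i / (2 * sensitivity)),
    where gain_i is the marginal coverage gain of candidate i.
    The total budget epsilon is split evenly across the k picks
    (simple sequential composition; tighter accounting is possible).
    """
    rng = random.Random(seed)
    eps_step = epsilon / k
    candidates = list(candidate_sets)
    chosen, covered = [], set()
    for _ in range(min(k, len(candidates))):
        gains = [len(s - covered) for s in candidates]
        weights = [math.exp(eps_step * g / (2.0 * sensitivity)) for g in gains]
        idx = rng.choices(range(len(candidates)), weights=weights)[0]
        picked = candidates.pop(idx)
        chosen.append(picked)
        covered |= picked
    return chosen, covered
```

With a large epsilon the mechanism concentrates on the true greedy choice; as epsilon shrinks, picks become more uniform and utility degrades, which is the usual privacy-utility trade-off these algorithms quantify.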