Individualized Privacy Accounting via Subsampling with Applications in Combinatorial Optimization

Authors: Badih Ghazi, Pritish Kamath, Ravi Kumar, Pasin Manurangsi, Adam Sealfon

ICML 2024

Reproducibility Assessment (each item lists the variable, the assessed result, and the LLM response)
Research Type: Theoretical
LLM Response: In this work, we give a new technique for analyzing individualized privacy accounting via the following simple observation: if an algorithm is one-sided add-DP, then its subsampled variant satisfies two-sided DP. From this, we obtain several improved algorithms for private combinatorial optimization problems, including decomposable submodular maximization and set cover. Our error guarantees are asymptotically tight and our algorithm satisfies pure-DP while previously known algorithms (Gupta et al., 2010; Chaturvedi et al., 2021) are approximate-DP. We also show an application of our technique beyond combinatorial optimization by giving a pure-DP algorithm for the shifting heavy hitter problem in a stream; previously, only an approximate-DP algorithm was known (Kaplan et al., 2021; Cohen & Lyu, 2023).
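To make the subsampling step in the abstract concrete, here is an illustrative sketch of Poisson subsampling together with the standard amplification-by-subsampling bound ε' = ln(1 + q·(e^ε − 1)). This is a generic sketch of the well-known amplification lemma, not the paper's own accounting argument; all function names and the Laplace-count base mechanism are our assumptions for illustration.

```python
import math
import random


def poisson_subsample(dataset, q, rng):
    # Poisson subsampling: include each record independently with probability q.
    return [x for x in dataset if rng.random() < q]


def laplace_count(dataset, epsilon, rng):
    # Baseline epsilon-DP count: Laplace(1/epsilon) noise (count has sensitivity 1).
    # A difference of two Exp(epsilon) draws is Laplace-distributed with scale 1/epsilon.
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return len(dataset) + noise


def subsampled_mechanism(dataset, q, epsilon, rng):
    # Subsampled variant: run the base mechanism on a Poisson subsample.
    return laplace_count(poisson_subsample(dataset, q, rng), epsilon, rng)


def amplified_epsilon(q, epsilon):
    # Standard amplification-by-subsampling bound for an epsilon-DP base mechanism:
    # the subsampled mechanism satisfies ln(1 + q * (e^epsilon - 1))-DP.
    return math.log(1 + q * (math.exp(epsilon) - 1))
```

Note that with q = 1 the bound recovers the base guarantee (ln(e^ε) = ε), and for small q it shrinks roughly linearly in q; the paper's observation is that this kind of amplification also converts a one-sided add-DP guarantee into a two-sided one.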
Researcher Affiliation: Industry
LLM Response: (1) Google Research, Mountain View, USA; (2) Google Research, Thailand; (3) Google Research, New York, USA. Correspondence to: Pasin Manurangsi <pasin@google.com>.
Pseudocode: Yes
LLM Response: Algorithm 1: REPEATED-EM_{ε0, A} (Gupta et al., 2010); Algorithm 2: REPEATED-AT_{ε0, A}; Algorithm 3: THRESHMONITOR_{ε0, τ, k}; Algorithm 4: DPGREEDYSCALING_ε; Algorithm 5: PRIVCONTGREEDY_{ε0, η, s} (Chaturvedi et al., 2021).
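Repeated private-selection algorithms such as REPEATED-EM (Gupta et al., 2010) build on the exponential mechanism, which selects a candidate with probability proportional to exp(ε·score/(2·Δ)). As a hedged illustration of that single selection step (the function name and interface below are ours, not from the paper's pseudocode):

```python
import math
import random


def exponential_mechanism(candidates, score, epsilon, sensitivity, rng):
    # One exponential-mechanism step: sample candidate c with probability
    # proportional to exp(epsilon * score(c) / (2 * sensitivity)).
    weights = [math.exp(epsilon * score(c) / (2.0 * sensitivity)) for c in candidates]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if r < acc:
            return c
    return candidates[-1]  # guard against floating-point rounding
```

In a greedy set-cover setting, for example, score(c) could be the number of newly covered elements contributed by set c; repeating this selection step (with appropriate privacy accounting across rounds) is the pattern that algorithms like REPEATED-EM follow.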
Open Source Code: No
LLM Response: The paper does not provide any statement or link indicating the availability of open-source code for the described methodology.

Open Datasets: No
LLM Response: The paper is theoretical and does not describe experiments using specific datasets, nor does it provide access information for any datasets.

Dataset Splits: No
LLM Response: The paper is theoretical and does not describe empirical experiments with data splits for training, validation, or testing.

Hardware Specification: No
LLM Response: The paper is purely theoretical and does not mention any hardware specifications used for experiments.

Software Dependencies: No
LLM Response: The paper is purely theoretical and does not mention specific software dependencies with version numbers.

Experiment Setup: No
LLM Response: The paper is purely theoretical and does not describe an experimental setup, hyperparameters, or training configurations.