User-Level Differential Privacy With Few Examples Per User

Authors: Badih Ghazi, Pritish Kamath, Ravi Kumar, Pasin Manurangsi, Raghu Meka, Chiyuan Zhang

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work we consider the example-scarce regime, where each user has only a few examples, and obtain the following results: For approximate-DP, we give a generic transformation of any item-level DP algorithm to a user-level DP algorithm. ... For pure-DP, we present a simple technique for adapting the exponential mechanism [MT07] to the user-level setting. ... We will be intentionally vague; all definitions and results will be formalized later in the paper. At a high level, our proof proceeds roughly as follows. First, we show that any (ε, δ)-item-level DP A with high probability satisfies a local version of user-level DP... The remainder of this section is devoted to the proof of Theorem 10.
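The quoted excerpt mentions adapting the exponential mechanism [MT07] to the user-level setting. For context, here is a minimal sketch of the standard (item-level) exponential mechanism that the paper builds on; this is the textbook version, not the paper's user-level adaptation, and the function name and signature are illustrative:

```python
import math
import random

def exponential_mechanism(candidates, scores, epsilon, sensitivity):
    # Standard (item-level) exponential mechanism [MT07]: sample each
    # candidate with probability proportional to
    # exp(epsilon * score / (2 * sensitivity)).
    # Shifting scores by the maximum avoids overflow in exp() and does
    # not change the sampling distribution.
    max_score = max(scores)
    weights = [math.exp(epsilon * (s - max_score) / (2 * sensitivity))
               for s in scores]
    return random.choices(candidates, weights=weights)[0]
```

The 2·sensitivity factor in the exponent is what yields ε-pure-DP for a score function whose per-item sensitivity is bounded by the given value; the paper's contribution concerns bounding this sensitivity at the user level rather than the item level.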
Researcher Affiliation | Collaboration | Badih Ghazi (Google Research, Mountain View, CA, US; badihghazi@gmail.com); Pritish Kamath (Google Research, Mountain View, CA, US; pritish@alum.mit.edu); Ravi Kumar (Google Research, Mountain View, CA, US; ravi.k53@gmail.com); Pasin Manurangsi (Google Research, Bangkok, Thailand; pasin@google.com); Raghu Meka (UCLA, Los Angeles, CA, US; raghum@cs.ucla.edu); Chiyuan Zhang (Google Research, Mountain View, CA, US; chiyuan@google.com)
Pseudocode | Yes | Algorithm 1: DelStab_{ε,δ,A}(x)
Open Source Code | No | The paper does not provide any specific links to source code or explicitly state that the code is publicly available.
Open Datasets | No | The paper is theoretical, defining the input 'x ∼ D^{nm}' as 'nm i.i.d. samples drawn from D'. It does not mention any specific, named public datasets, nor provide links or citations for accessing data for experimental purposes.
Dataset Splits | No | The paper is theoretical and focuses on mathematical derivations and bounds. It does not describe any experimental setup involving dataset splits for training, validation, or testing.
Hardware Specification | No | The paper is theoretical and does not describe any experimental setup; therefore, no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and does not describe any experimental setup that would require specific software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and focuses on algorithm design and theoretical bounds. It describes mathematical parameters for algorithms (e.g., ε, δ, m) but does not provide practical experimental setup details such as hyperparameter values, training schedules, or specific model architectures.