Capacity Bounded Differential Privacy
Authors: Kamalika Chaudhuri, Jacob Imola, Ashwin Machanavajjhala
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our results demonstrate that these definitions possess a number of interesting properties enjoyed by differential privacy and some of its existing relaxations; additionally, common mechanisms such as the Laplace and Gaussian mechanisms enjoy better privacy guarantees for the same added noise under these definitions. We analyze well-known privacy mechanisms, such as the Laplace and Gaussian mechanisms, under (lin, KL) and (lin, Renyi) capacity bounded privacy, where the adversaries are the class of all linear functions. We show that restricting the capacity of the adversary does provide improvements in the privacy guarantee in many cases. We then use this to demonstrate that the popular Matrix Mechanism [18, 19, 22] gives an improvement in the privacy guarantees when considered under the capacity bounded definition. Table 1: Privacy parameters of different mechanisms and divergences with a linear adversary and an unrestricted adversary. Proofs appear in the Appendix. Figure 1a plots the (lin, Renyi) upper bound, (4), and the exact value of the (lin, Renyi) parameter, as functions of α when ϵ = 1. We see that the exact (lin, Renyi) parameter is always better than (4), although the upper bound may sometimes be worse. (A minimal sketch of the Laplace and Gaussian mechanisms appears after this table.) |
| Researcher Affiliation | Academia | Kamalika Chaudhuri, UC San Diego (kamalika@cs.ucsd.edu); Jacob Imola, UC San Diego (jimola@eng.ucsd.edu); Ashwin Machanavajjhala, Duke University (ashwin@cs.duke.edu) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide a specific repository link or an explicit statement about releasing source code for the methodology described. |
| Open Datasets | No | The paper is theoretical and does not conduct empirical experiments with datasets; therefore, it does not provide access information for a dataset. |
| Dataset Splits | No | The paper is theoretical and does not conduct empirical experiments with datasets; therefore, it does not provide specific dataset split information. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running its analyses or theoretical derivations. |
| Software Dependencies | No | The paper does not provide any specific ancillary software details (e.g., library or solver names with version numbers) needed to replicate the theoretical derivations or analyses. |
| Experiment Setup | No | The paper is theoretical and does not describe an empirical experimental setup, including hyperparameters or system-level training settings. |
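
The Research Type row above quotes the paper's analysis of the Laplace and Gaussian mechanisms under capacity bounded privacy. The sketch below is only a minimal illustration of the standard noise-adding mechanisms being analyzed, not the paper's capacity-bounded accounting; the function names, the scalar count query, and the classical (ε, δ) calibration of the Gaussian noise are illustrative assumptions, not code or notation from the paper.

```python
# Minimal sketch of the standard Laplace and Gaussian mechanisms referenced in
# the paper's analysis. Parameter names (sensitivity, epsilon, delta) follow
# common DP conventions and are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release true_answer + Lap(sensitivity / epsilon) noise (standard epsilon-DP)."""
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

def gaussian_mechanism(true_answer: float, l2_sensitivity: float,
                       epsilon: float, delta: float) -> float:
    """Release true_answer + N(0, sigma^2) noise using the classical
    (epsilon, delta)-DP calibration sigma = sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon."""
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    return true_answer + rng.normal(loc=0.0, scale=sigma)

# Example: privately release a count query with sensitivity 1.
true_count = 42.0
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0))
print(gaussian_mechanism(true_count, l2_sensitivity=1.0, epsilon=1.0, delta=1e-5))
```

Under the paper's (lin, KL) and (lin, Renyi) definitions, the claim quoted above is that the same amount of added noise yields better privacy parameters against linear adversaries than under the unrestricted definitions; the sketch does not attempt to reproduce that capacity-bounded calculation.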