The Discrete Gaussian for Differential Privacy

Authors: Clément L. Canonne, Gautam Kamath, Thomas Steinke

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Specifically, we theoretically and experimentally show that adding discrete Gaussian noise provides essentially the same privacy and accuracy guarantees as the addition of continuous Gaussian noise. We also present a simple and efficient algorithm for exact sampling from this distribution."
Researcher Affiliation | Collaboration | Clément L. Canonne, IBM Research, Almaden (ccanonne@cs.columbia.edu); Gautam Kamath, University of Waterloo (g@csail.mit.edu); Thomas Steinke, IBM Research, Almaden (dgauss@thomas-steinke.net)
Pseudocode | Yes | "In Algorithm 1, we present a method to efficiently sample exactly from a discrete Gaussian on a finite computer given access only to uniformly random bits. This satisfies the guarantee in Theorem 13. ... Algorithm 1 requires sampling from Bernoulli(exp(−γ)) as a subroutine; we show how to do this in Algorithm 2."
Open Source Code | Yes | [Dga] https://github.com/IBM/discrete-gaussian-differential-privacy, 2020.
Open Datasets | No | The paper focuses on theoretical properties and numerical comparisons of distribution characteristics rather than experiments involving datasets; a publicly available training dataset is therefore not applicable.
Dataset Splits | No | For the same reason, training/validation/test split information is not applicable to this paper's experiments.
Hardware Specification | No | The paper does not provide any specific details about the hardware used for its numerical comparisons or simulations.
Software Dependencies | No | The paper states that "Python code for Algorithm 1 (and our Figures) is available online [Dga]", implying the use of Python, but it does not specify version numbers for Python or any other software libraries.
Experiment Setup | No | The paper defines distribution parameters (e.g., σ, ε, δ), but these are mathematical quantities rather than experimental configuration such as hyperparameters or system settings; no dedicated experimental-setup section is provided.
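Based on the quoted description, Algorithm 1 samples a discrete Gaussian exactly by rejection sampling from a discrete Laplace distribution, using the exact Bernoulli(exp(−γ)) subroutine of Algorithm 2. The sketch below is our own minimal Python rendering of that scheme, not the code from the IBM repository; all function names are ours, and rational arithmetic via `Fraction` stands in for the paper's exact-sampling requirement (assuming `random.randrange` as the source of uniform randomness).

```python
import math
import random
from fractions import Fraction

def bernoulli(p: Fraction) -> int:
    """Exact Bernoulli(p) for rational p in [0, 1], using only discrete randomness."""
    return int(random.randrange(p.denominator) < p.numerator)

def bernoulli_exp(gamma: Fraction) -> int:
    """Exact Bernoulli(exp(-gamma)) for rational gamma >= 0 (cf. the paper's Algorithm 2)."""
    if gamma <= 1:
        # Draw B_k ~ Bernoulli(gamma / k) until the first failure at k = K;
        # K is odd with probability exactly exp(-gamma).
        k = 1
        while bernoulli(gamma / k):
            k += 1
        return k % 2
    # exp(-gamma) = exp(-1)^floor(gamma) * exp(-(gamma - floor(gamma)))
    for _ in range(math.floor(gamma)):
        if not bernoulli_exp(Fraction(1)):
            return 0
    return bernoulli_exp(gamma - math.floor(gamma))

def discrete_laplace(t: int) -> int:
    """Exact sample with P(X = x) proportional to exp(-|x|/t) over the integers."""
    while True:
        u = random.randrange(t)                  # uniform in {0, ..., t-1}
        if not bernoulli_exp(Fraction(u, t)):
            continue                             # reject and restart
        v = 0                                    # geometric count of exp(-1) successes
        while bernoulli_exp(Fraction(1)):
            v += 1
        x = u + t * v
        negative = bernoulli(Fraction(1, 2))
        if negative and x == 0:
            continue                             # avoid double-counting zero
        return -x if negative else x

def discrete_gaussian(sigma_sq: Fraction) -> int:
    """Exact sample with P(X = x) proportional to exp(-x^2 / (2 sigma_sq))
    (cf. the paper's Algorithm 1)."""
    t = math.isqrt(math.floor(sigma_sq)) + 1     # t = floor(sigma) + 1
    while True:
        y = discrete_laplace(t)
        # Accept y with probability exp(-(|y| - sigma^2/t)^2 / (2 sigma^2)).
        gamma = (abs(y) - sigma_sq / t) ** 2 / (2 * sigma_sq)
        if bernoulli_exp(gamma):
            return y
```

Keeping every acceptance probability a `Fraction` is what makes the rejection steps exact rather than floating-point approximations, matching the paper's point that the sampler needs only uniformly random bits on a finite computer.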