Elliptical Perturbations for Differential Privacy

Authors: Matthew Reimherr, Jordan Awan

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We study elliptical distributions in locally convex vector spaces, and determine conditions when they can or cannot be used to satisfy differential privacy (DP). We show that elliptical distributions with the same dispersion operator, C, are equivalent if the difference of their means lies in the Cameron-Martin space of C. In the case of releasing finite-dimensional summaries using elliptical perturbations, we show that the privacy parameter ϵ can be computed in terms of a one-dimensional maximization problem. We apply this result to consider multivariate Laplace, t, Gaussian, and K-norm noise. Surprisingly, we show that the multivariate Laplace noise does not achieve ϵ-DP in any dimension greater than one. Finally, we show that when the dimension of the space is infinite, no elliptical distribution can be used to give ϵ-DP; only (ϵ, δ)-DP is possible. This work also highlights the need for more tools when the statistical summaries are complex objects such as functions. Properties that hold in finite dimensions may not hold in infinite dimensions in some surprisingly subtle ways.
Researcher Affiliation | Academia | Matthew Reimherr, Department of Statistics, Pennsylvania State University, University Park, PA 16802, mreimherr@psu.edu; Jordan Awan, Department of Statistics, Pennsylvania State University, University Park, PA 16802, awan@psu.edu
Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide any statements or links regarding the availability of open-source code for the methodology described.
Open Datasets | No | The paper is theoretical and describes no experiments involving training on a dataset; it mentions no publicly available or open datasets.
Dataset Splits | No | The paper is theoretical and describes no experiments involving data, so it provides no training/validation/test dataset splits.
Hardware Specification | No | The paper is theoretical and describes no experiments, so it specifies no hardware used for running experiments.
Software Dependencies | No | The paper is theoretical and describes no experiments, so it specifies no software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and describes no experiments, so it provides no experimental setup details such as hyperparameters or training settings.
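The abstract's claim that multivariate Laplace noise cannot achieve ϵ-DP in dimension greater than one can be illustrated with a small numerical sketch. This is not the paper's proof, only a suggestive computation: in d = 3 the symmetric multivariate Laplace density reduces, via the closed form K_{1/2}(z) = √(π/(2z)) e^{−z}, to f(x) ∝ e^{−√2‖x‖}/‖x‖, which is singular at its centre. The pointwise privacy-loss ratio between two copies shifted by a (hypothetical, sensitivity-1) difference in means therefore blows up as the output approaches the centre of the first density, so no finite ϵ can bound it:

```python
import math

def laplace3d_density(x):
    # Unnormalized density of the trivariate (d = 3) symmetric multivariate
    # Laplace distribution. For d = 3 the Bessel-K factor has the closed form
    # K_{1/2}(z) = sqrt(pi/(2z)) * exp(-z), so the density reduces to
    # f(x) proportional to exp(-sqrt(2) * ||x||) / ||x||, singular at 0.
    r = math.sqrt(sum(t * t for t in x))
    return math.exp(-math.sqrt(2.0) * r) / r

def privacy_loss(x, shift):
    # Pointwise ratio of the two mechanism output densities for adjacent
    # datasets whose released statistics differ by `shift`.
    shifted = [a - b for a, b in zip(x, shift)]
    return laplace3d_density(x) / laplace3d_density(shifted)

shift = (1.0, 0.0, 0.0)  # hypothetical sensitivity-1 difference in means
for r in (1e-1, 1e-3, 1e-5):
    x = (r, 0.0, 0.0)    # approach the centre of the first density
    print(f"r = {r:.0e}:  loss ratio = {privacy_loss(x, shift):.3e}")
```

The printed ratios grow roughly like 1/r as r → 0, consistent with the paper's conclusion that only (ϵ, δ)-DP, not pure ϵ-DP, is attainable with this noise. Contrast this with the K-norm density ∝ e^{−‖x‖}, whose log-ratio is bounded by ‖shift‖ via the triangle inequality.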