On Differentially Private Sampling from Gaussian and Product Distributions
Authors: Badih Ghazi, Xiao Hu, Ravi Kumar, Pasin Manurangsi
NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present new DP sampling algorithms, and show that they achieve near-optimal sample complexity in the first two settings. |
| Researcher Affiliation | Collaboration | Badih Ghazi, Google Research, Mountain View, CA, US (badihghazi@gmail.com); Xiao Hu, University of Waterloo, Waterloo, Canada (xiaohu@uwaterloo.ca); Ravi Kumar, Google Research, Mountain View, CA, US (ravi.k53@gmail.com); Pasin Manurangsi, Google Research, Bangkok, Thailand (pasin@google.com) |
| Pseudocode | Yes | Algorithm 1 SPHERICALGAUSSIANSAMPLER. Parameters: B, σ > 0, and n ∈ ℕ. Sample X_1, …, X_n ∼ D; for i = 1, …, n do X_i^trunc = trunc_{2B}(X_i) (see (1)); sample Z ∼ N(0, σ²I); return Z + (1/n) Σ_{i∈[n]} X_i^trunc. A hedged code sketch of this procedure is given after the table. |
| Open Source Code | No | The paper does not provide any statement or link regarding the release of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not use or refer to any specific publicly available datasets for training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe empirical experiments, thus no dataset split information is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe empirical experiments, thus no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not describe empirical experiments, thus no specific software dependencies with version numbers are provided. |
| Experiment Setup | No | The paper is theoretical and does not describe empirical experiments, thus no specific experimental setup details like hyperparameters or training configurations are provided. |
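The pseudocode reported in the table (Algorithm 1, SPHERICALGAUSSIANSAMPLER) is short enough to illustrate directly. Below is a minimal NumPy sketch of the procedure as reconstructed from that cell: truncate each sample to the L2 ball of radius 2B, average the truncated samples, and add spherical Gaussian noise N(0, σ²I). The helper and parameter names (`truncate_l2`, `spherical_gaussian_sampler`, `B`, `sigma`) are illustrative, and the calibration of B, σ, and n needed for the paper's privacy and accuracy guarantees is not reproduced here.

```python
import numpy as np

def truncate_l2(x, radius):
    """Project x onto the L2 ball of the given radius (the trunc_{2B} step)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def spherical_gaussian_sampler(samples, B, sigma, rng=None):
    """Sketch of Algorithm 1 (SPHERICALGAUSSIANSAMPLER), as read from the table.

    samples: (n, d) array of i.i.d. draws X_1, ..., X_n ~ D.
    Returns Z + (1/n) * sum_i X_i^trunc, where Z ~ N(0, sigma^2 I)
    and X_i^trunc is X_i truncated to the L2 ball of radius 2B.
    The privacy-calibrated choices of B, sigma, n are not derived here.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = samples.shape
    truncated = np.stack([truncate_l2(x, 2 * B) for x in samples])
    Z = rng.normal(scale=sigma, size=d)   # spherical Gaussian noise
    return Z + truncated.mean(axis=0)     # noisy average of truncated samples
```

A toy invocation, purely for illustration: `spherical_gaussian_sampler(np.random.default_rng(0).normal(size=(100, 5)), B=3.0, sigma=0.5)` returns a single 5-dimensional vector intended to approximate one draw from the underlying Gaussian.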