Sharp Composition Bounds for Gaussian Differential Privacy via Edgeworth Expansion
Authors: Qinqing Zheng, Jinshuo Dong, Qi Long, Weijie Su
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we present numerical experiments to compare the Edgeworth approximation and the CLT approximation. |
| Researcher Affiliation | Academia | University of Pennsylvania. |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/enosair/gdp-edgeworth. |
| Open Datasets | No | The experiments are numerical comparisons of theoretical bounds for differential privacy, not experiments on a real-world dataset with traditional training/test/validation splits. Therefore, no training dataset is explicitly used or made available. |
| Dataset Splits | No | As above, the experiments are numerical comparisons of theoretical privacy bounds rather than experiments on a real-world dataset, so no training/validation/test split is mentioned. |
| Hardware Specification | Yes | All the methods are implemented in Python and all the experiments are carried out on a MacBook with a 2.5GHz processor and 16GB memory. |
| Software Dependencies | No | The paper states 'All the methods are implemented in Python' but does not specify the version number of Python or any other software dependencies with their versions. |
| Experiment Setup | Yes | We let the number of compositions n vary from 1 to 10. Since the privacy guarantee decays as n increases and the resulting curves would be very close to the axes, we set θ = 3/n for the sake of better visibility. (An illustrative sketch of this experiment loop appears below the table.) |
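As a rough illustration of the experiment loop described in the setup row, the sketch below varies n from 1 to 10 with θ = 3/n and evaluates the Gaussian (CLT-style) trade-off function G_μ(α) = Φ(Φ⁻¹(1−α) − μ). The mapping μ = √n·θ and the helper names here are assumptions for illustration only; the Edgeworth approximation itself is computed by the authors' code at https://github.com/enosair/gdp-edgeworth and is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def gdp_tradeoff(alpha, mu):
    """Trade-off (type II error) function of mu-GDP: G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu)."""
    return norm.cdf(norm.ppf(1.0 - alpha) - mu)

alphas = np.linspace(0.0, 1.0, 201)  # grid of type I errors
for n in range(1, 11):
    theta = 3.0 / n            # per-composition parameter, as quoted in the setup row
    mu = np.sqrt(n) * theta    # hypothetical: treat theta as a per-step GDP parameter
    curve = gdp_tradeoff(alphas, mu)
    # In the paper's experiments, this CLT-based curve would be compared against
    # the Edgeworth approximation produced by the authors' code.
    print(f"n={n:2d}, theta={theta:.3f}, type II error at alpha=0.05: {curve[10]:.4f}")
```

The loop only reproduces the stated parameter schedule (n from 1 to 10, θ = 3/n); the choice of mechanism and the comparison against the Edgeworth curves follow the released repository.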