Shifted Interpolation for Differential Privacy
Authors: Jinho Bok, Weijie J. Su, Jason Altschuler
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | As a proof of concept, here we consider regularized logistic regression on MNIST (LeCun et al., 2010). We compare our results with the state-of-the-art Rényi DP bounds, and existing f-DP bounds (based on the composition theorem) which we denote as GDP Composition. For a fair comparison, we use the same algorithm Noisy CGD, with all parameters unchanged, and only focus on the privacy accounting. Table 1 demonstrates that for this problem, our improved privacy guarantees are tighter, enabling longer training for the same privacy budget which helps both training and testing accuracy (cf. Table 2). For full details of the experiment, see D.2. |
| Researcher Affiliation | Academia | Jinho Bok 1 Weijie J. Su 1 Jason Altschuler 1 1Department of Statistics and Data Science, University of Pennsylvania, Philadelphia, PA, USA. |
| Pseudocode | No | The paper describes algorithms textually but does not include any explicit pseudocode blocks or algorithm listings. |
| Open Source Code | Yes | Code reproducing our numerics can be found here: https://github.com/jinhobok/shifted_interpolation_dp. |
| Open Datasets | Yes | As a proof of concept, here we consider regularized logistic regression on MNIST (LeCun et al., 2010). |
| Dataset Splits | Yes | The MNIST dataset has n = 60000 training data points and 10000 test data points; |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments. It only mentions general experimental parameters and dataset usage. |
| Software Dependencies | No | The paper mentions using a 'framework of privacy loss random variables' and 'standard nonlinear equation solvers' but does not specify software names with version numbers (e.g., 'PyTorch 1.9', 'Python 3.8'). |
| Experiment Setup | Yes | For both Noisy CGD and Noisy SGD, we set the parameters as C = 8, η = 0.05, b = 1500, σ = 1/100, L = 10, E ∈ {50, 100, 200} and λ ∈ {0.002, 0.004}. |
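To make the quoted setup concrete, the following is a minimal sketch of one pass of noisy projected SGD for regularized logistic regression, under our reading of the parameter roles (η as step size, b as batch size, σ as Gaussian noise scale, C as projection radius, λ as ℓ₂-regularization strength). These roles, the toy data, and the exact placement of the noise in the update are assumptions for illustration, not details confirmed by the excerpt above.

```python
import numpy as np

# Assumed parameter roles (our reading of the quoted setup, not confirmed):
#   eta: step size, b: batch size, sigma: Gaussian noise scale,
#   C: radius of the projection set, lam: l2-regularization strength.
eta, b, sigma, C, lam = 0.05, 1500, 1 / 100, 8.0, 0.002

rng = np.random.default_rng(0)
n, d = 6000, 20                          # toy stand-in for MNIST-sized data
X = rng.standard_normal((n, d))
y = rng.integers(0, 2, size=n) * 2 - 1   # labels in {-1, +1}
w = np.zeros(d)

def grad(w, Xb, yb):
    """Gradient of the l2-regularized logistic loss on a batch."""
    margins = yb * (Xb @ w)
    coeff = -yb / (1 + np.exp(margins))
    return Xb.T @ coeff / len(yb) + lam * w

def project(w, C):
    """Project onto the Euclidean ball of radius C."""
    norm = np.linalg.norm(w)
    return w if norm <= C else w * (C / norm)

# One epoch of noisy projected SGD: sample a batch, add Gaussian noise
# to the gradient, step, and project back onto the constraint set.
for _ in range(n // b):
    idx = rng.choice(n, size=b, replace=False)
    noise = sigma * rng.standard_normal(d)
    w = project(w - eta * (grad(w, X[idx], y[idx]) + noise), C)
```

The projection step keeps the iterates in a bounded set, which is what lets f-DP-style accounting bound the sensitivity of each update; the actual privacy analysis in the paper is considerably more refined than this sketch.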