Differentially Private Optimization with Sparse Gradients

Authors: Badih Ghazi, Cristóbal Guzmán, Pritish Kamath, Ravi Kumar, Pasin Manurangsi

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | The paper does not include experiments; all results are stated with full proofs and are self-contained.
Researcher Affiliation | Collaboration | Badih Ghazi (Google Research, badihghazi@google.com); Cristóbal Guzmán (Google Research and Pontificia Universidad Católica de Chile, crguzman@google.com); Pritish Kamath (Google Research, pritishk@google.com); Ravi Kumar (Google Research, ravi.k53@gmail.com); Pasin Manurangsi (Google Research, pasin@google.com)
Pseudocode | Yes | Algorithm 1: Projection_Mechanism(z(S), ε, δ, n); Algorithm 2: Subsampled_Bias-Reduced_Gradient_Estimator(x, S, N, ε, δ); Algorithm 3: Subsampled_Bias-Reduced_Sparse_SGD(x0, S, ε, δ); Algorithm 4: Output_Perturbation; Algorithm 5: Gaussian ℓ1-Recovery(z(S), ε, δ, n); Algorithm 6: Boosting_Bias-Reduced_SGD(S, ε, δ, K); Algorithm 7: Sparse_Exponential_Mechanism
Open Source Code | No | The paper states: 'The paper does not include experiments requiring code.' and, quoting the NeurIPS checklist guidance: 'While NeurIPS does not require releasing code, the conference does require all submissions to provide some reasonable avenue for reproducibility, which may depend on the nature of the contribution.' There is no explicit statement about releasing code and no link to a code repository.
Open Datasets | No | The paper states that it is theoretical research and does not include experiments or datasets.
Dataset Splits | No | The paper states that it is theoretical research and does not include experiments or datasets, so no dataset splits are provided.
Hardware Specification | No | The paper states: 'The paper does not include experiments.' Accordingly, no hardware specifications are mentioned.
Software Dependencies | No | The paper states: 'The paper does not include experiments.' Accordingly, no software dependencies with version numbers are mentioned.
Experiment Setup | No | The paper states: 'The paper does not include experiments.' Accordingly, no experimental setup details such as hyperparameters or training settings are provided.
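The algorithms listed in the pseudocode row all privatize gradient estimates under (ε, δ)-differential privacy. As a rough, generic illustration only (this is the standard clip-and-add-Gaussian-noise baseline, not the paper's Projection Mechanism or its bias-reduced sparse estimators, which exploit gradient sparsity for better dimension dependence), such a private gradient step can be sketched as:

```python
import numpy as np

def private_avg_gradient(grads, clip_norm, epsilon, delta, rng=None):
    """Baseline (epsilon, delta)-DP gradient average via the Gaussian mechanism.

    Illustrative sketch only: clips each per-example gradient to bound its
    L2 sensitivity, averages, and adds calibrated Gaussian noise. The paper's
    algorithms improve on this dense baseline when gradients are sparse.
    """
    rng = rng or np.random.default_rng(0)
    n = len(grads)
    # Clip each per-example gradient so its L2 norm is at most clip_norm.
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12)) for g in grads]
    avg = np.mean(clipped, axis=0)
    # L2 sensitivity of the average is at most 2 * clip_norm / n; the classic
    # Gaussian-mechanism noise scale for (epsilon, delta)-DP follows.
    sigma = (2.0 * clip_norm / n) * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```

With sparse gradients, most coordinates of `avg` are zero yet this baseline still injects noise into every coordinate, which is exactly the dimension-dependent cost the paper's sparsity-aware mechanisms are designed to avoid.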