Faster Algorithms for User-Level Private Stochastic Convex Optimization
Authors: Andrew Lowy, Daogao Liu, Hilal Asi
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This is a theoretical paper without experiments. |
| Researcher Affiliation | Collaboration | Andrew Lowy, Wisconsin Institute for Discovery, University of Wisconsin-Madison (alowy@wisc.edu); Daogao Liu, Department of Computer Science, University of Washington (liudaogao@gmail.com); Hilal Asi, Apple Machine Learning Research (hilal.asi94@gmail.com) |
| Pseudocode | Yes | Algorithm 1: User-Level DP Phased SGD with Outlier Iterate Removal and Output Perturbation; Algorithm 2: User-Level DP Accelerated Minibatch SGD(F̂_i, T_i, K_i, x_1^i, τ, ε, δ); Algorithm 3: User-Level DP Accelerated Phased ERM with Outlier Gradient Removal. (A hedged sketch of the output-perturbation step appears after this table.) |
| Open Source Code | No | The paper is theoretical, with no experiments, and makes no statement about releasing source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and performs no experiments on specific datasets. |
| Dataset Splits | No | Since the paper performs no experiments, no dataset splits are provided. |
| Hardware Specification | No | The paper performs no experiments and therefore describes no hardware. |
| Software Dependencies | No | The paper performs no experiments and lists no software dependencies with version numbers for replication. |
| Experiment Setup | No | The paper performs no experiments and provides no setup details such as hyperparameter values. |
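
The paper provides pseudocode but no implementation. As a purely illustrative aid, below is a minimal Python sketch of the output-perturbation step named in Algorithm 1: releasing a trained iterate with Gaussian noise calibrated to an L2 sensitivity bound. This is not the paper's algorithm. The function names (`gaussian_output_perturbation`, `sgd`), the toy objective, and the `sensitivity` value are assumptions made for illustration, and the sketch omits the phased SGD schedule, the outlier iterate removal, and the user-level (rather than item-level) accounting that the paper's analysis provides.

```python
import numpy as np


def gaussian_output_perturbation(x_priv, sensitivity, eps, delta, rng=None):
    """Release a vector with (eps, delta)-DP via output perturbation.

    Uses the classical Gaussian-mechanism calibration
        sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / eps,
    where `sensitivity` must bound the L2 change in x_priv when one
    user's data is replaced. That bound comes from a stability analysis
    (as in the paper) and is assumed given here.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return x_priv + rng.normal(0.0, sigma, size=np.shape(x_priv))


def sgd(grad_fn, x0, steps, lr):
    """Plain (non-private) SGD; grad_fn(x) returns a stochastic gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad_fn(x)
    return x


# Toy usage: minimize ||x - c||^2 with noisy gradients, then privatize the output.
rng = np.random.default_rng(0)
c = np.array([1.0, -2.0, 0.5])
grad_fn = lambda x: 2.0 * (x - c) + 0.01 * rng.normal(size=x.shape)
x_hat = sgd(grad_fn, x0=np.zeros(3), steps=200, lr=0.1)
x_released = gaussian_output_perturbation(x_hat, sensitivity=0.05, eps=1.0, delta=1e-5)
print(x_released)
```

The usual motivation for output perturbation, and why it appears in Algorithm 1, is that noise is added once to the final iterate rather than at every step, so the utility cost is tied to the solution's sensitivity instead of the number of gradient iterations.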