Differentially Private Stochastic Coordinate Descent
Authors: Georgios Damaskinos, Celestine Mendler-Dünner, Rachid Guerraoui, Nikolaos Papandreou, Thomas Parnell
AAAI 2021, pp. 7176–7184 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our empirical results compare our new DP-SCD algorithm against SCD, SGD and DP-SGD. We test the performance on three popular GLM applications, namely ridge regression, logistic regression and L2-regularized SVMs. (An illustrative sketch of these three objectives appears after the table.) |
| Researcher Affiliation | Collaboration | 1 École Polytechnique Fédérale de Lausanne (EPFL), Switzerland; 2 University of California, Berkeley; 3 IBM Research, Zurich |
| Pseudocode | Yes | Algorithm 1: DP-SCD (for Problem (3)) |
| Open Source Code | Yes | Our implementation is available (https://github.com/gdamaskinos/dpscd). |
| Open Datasets | Yes | We provide detailed information regarding our setup in Appendix E. In particular, we describe the datasets (Year Prediction MSD, Phishing, Adult)... |
| Dataset Splits | No | The paper mentions "validation MSE" in Section 5.1 but does not provide specific details on the dataset splits (e.g., percentages or counts) for training, validation, and test sets in the main text. |
| Hardware Specification | No | The paper does not explicitly specify the hardware (e.g., CPU, GPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions popular packages like Scikit-learn, TensorFlow, and Liblinear in the context of SCD's existing implementations, but it does not provide specific version numbers for the software dependencies used in its own experimental setup. |
| Experiment Setup | No | The paper states that "We provide detailed information regarding our setup in Appendix E. In particular, we describe... the values for each hyperparameter...". This indicates the specific values are not presented in the main text. |
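
As a quick reference for the three GLM applications listed in the Research Type row, the sketch below writes out their textbook objectives (ridge regression, logistic regression, and an L2-regularized SVM with the hinge loss). This is a minimal illustration only: the function names, the regularization constant `lam`, and the synthetic data are our own assumptions, not the paper's DP-SCD algorithm or code from the linked repository.

```python
# Illustrative textbook objectives for the three GLM applications evaluated
# in the paper. Not the paper's DP-SCD implementation; names and the
# regularization constant `lam` are assumptions made for this sketch.
import numpy as np


def ridge_loss(w, X, y, lam=1.0):
    """Mean squared error with an L2 penalty on the weights."""
    residual = X @ w - y
    return 0.5 * np.mean(residual ** 2) + 0.5 * lam * np.dot(w, w)


def logistic_loss(w, X, y, lam=1.0):
    """L2-regularized logistic loss; labels y are expected in {-1, +1}."""
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins))) + 0.5 * lam * np.dot(w, w)


def svm_hinge_loss(w, X, y, lam=1.0):
    """L2-regularized soft-margin SVM (hinge loss); labels y in {-1, +1}."""
    margins = y * (X @ w)
    return np.mean(np.maximum(0.0, 1.0 - margins)) + 0.5 * lam * np.dot(w, w)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    w_true = rng.normal(size=5)
    y_reg = X @ w_true + 0.1 * rng.normal(size=100)  # regression targets
    y_cls = np.sign(X @ w_true)                      # {-1, +1} class labels
    w0 = np.zeros(5)
    print(ridge_loss(w0, X, y_reg),
          logistic_loss(w0, X, y_cls),
          svm_hinge_loss(w0, X, y_cls))
```

Any of these objectives can be minimized by coordinate descent or (DP-)SGD; the paper's comparison concerns how DP-SCD, SCD, SGD, and DP-SGD behave on such problems, with full hyperparameter values deferred to its Appendix E.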