LDP-FL: Practical Private Aggregation in Federated Learning with Local Differential Privacy
Authors: Lichao Sun, Jianwei Qian, Xun Chen
IJCAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical evaluations on three commonly used datasets in prior differential privacy work, MNIST, Fashion-MNIST and CIFAR-10, demonstrate that our solution can not only achieve superior deep learning performance but also provide a strong privacy guarantee at the same time. |
| Researcher Affiliation | Collaboration | Lichao Sun (Lehigh University), Jianwei Qian (Samsung Research America), Xun Chen (Samsung Research America) |
| Pseudocode | Yes | Algorithm 1: LDP-FL, Algorithm 2: Data Perturbation, Algorithm 3: Parameter Shuffling |
| Open Source Code | No | The paper does not provide any links to source code or explicit statements about code availability. |
| Open Datasets | Yes | Empirical evaluations on three commonly used datasets in prior differential privacy work, MNIST, Fashion-MNIST and CIFAR-10 |
| Dataset Splits | No | The paper mentions 'The training data and the testing data are fed into the network directly in each client' but does not specify a validation split or provide specific percentages for dataset partitioning. |
| Hardware Specification | Yes | The proposed models are implemented using Pytorch, and all experiments are done with a single GPU NVIDIA Tesla V100 on the local server. |
| Software Dependencies | No | The proposed models are implemented using Pytorch, but no specific version number for PyTorch or other software dependencies is provided. |
| Experiment Setup | Yes | The learning rate γ is set as 0.03 for MNIST/FMNIST and 0.015 for CIFAR-10. For each weight, we clip them in a fixed range. In this work, we set (c, r) = (0, 0.075) and (0, 0.015) by default for MNIST and FMNIST, respectively. However, for CIFAR-10, instead of using a fixed range, due to the complexity of the model, we set c and r adaptively by the weight range of each layer. For MNIST, FMNIST and CIFAR, we set the number of total clients as 100, 200, 500 respectively. Experiments on MNIST and FMNIST can be finished within an hour with 10 CRs, and experiments on CIFAR-10 need about 2 hours with 15 CRs. |
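The data-perturbation step (Algorithm 2) and the clipping range (c, r) quoted above can be illustrated with a short sketch. This is a hedged reconstruction, not the authors' code: it clips a weight to [c - r, c + r] and then applies a randomized-response style mechanism that outputs one of two extreme values, which is unbiased in expectation and satisfies epsilon-local differential privacy. The function name `perturb_weight` and the default (c, r) = (0, 0.075), taken from the paper's MNIST setting, are illustrative choices.

```python
import math
import random

def perturb_weight(w, epsilon, c=0.0, r=0.075):
    """Sketch of an LDP weight-perturbation step.

    The weight is clipped to [c - r, c + r], then mapped to one of two
    extreme values c +/- r*(e^eps + 1)/(e^eps - 1). The probability of the
    upper value is chosen so that E[output] = w (unbiased) while the ratio
    of the two output probabilities is bounded by e^eps (epsilon-LDP).
    """
    w = max(c - r, min(c + r, w))  # clip to the fixed range [c - r, c + r]
    e = math.exp(epsilon)
    scale = r * (e + 1) / (e - 1)
    # probability of reporting the upper extreme value; lies in
    # [1/(e^eps + 1), e^eps/(e^eps + 1)] for any clipped w
    p_high = ((w - c) * (e - 1) + r * (e + 1)) / (2 * r * (e + 1))
    return c + scale if random.random() < p_high else c - scale
```

Averaging many perturbed reports recovers the true weight in expectation, which is why the server-side aggregation can still converge despite each individual report being noisy.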
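The parameter-shuffling step (Algorithm 3) can likewise be sketched. The idea, as described in the paper, is that the aggregator shuffles each parameter independently across clients so that a client's full perturbed weight vector cannot be linked back to it. The helper below is a minimal illustration under that assumption; `shuffle_parameters` and the list-of-lists layout are not from the paper.

```python
import random

def shuffle_parameters(client_updates):
    """Shuffle each parameter coordinate independently across clients.

    `client_updates` is a list of equal-length weight lists, one per
    client. For every coordinate j, the values at position j are permuted
    across clients, breaking the linkage between a client and its full
    update while preserving the multiset of values per coordinate (and
    hence any coordinate-wise aggregate such as the mean).
    """
    n_clients = len(client_updates)
    n_params = len(client_updates[0])
    shuffled = [[None] * n_params for _ in range(n_clients)]
    for j in range(n_params):
        perm = list(range(n_clients))
        random.shuffle(perm)  # independent permutation per coordinate
        for i, src in enumerate(perm):
            shuffled[i][j] = client_updates[src][j]
    return shuffled
```

Because each coordinate keeps exactly the same multiset of values, coordinate-wise averaging at the server is unaffected by the shuffle.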