The Poisson Binomial Mechanism for Unbiased Federated Learning with Secure Aggregation
Authors: Wei-Ning Chen, Ayfer Ozgur, Peter Kairouz
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In Figure 3, we apply PBM with different m's (which dictate the communication cost) under a given privacy requirement ε. Note that once m is determined, the field that SecAgg operates on will be Z_{2^⌈log₂(nm)⌉} and hence the communication cost becomes ⌈log₂(nm)⌉ bits. In Figure 1, we numerically compute the privacy guarantee of Algorithm 1 and compare it with the Gaussian mechanism. For the PBM, we fix the communication cost (i.e., fix m), vary the parameter θ, and compute the corresponding Rényi DP (i.e., ε(α)) and MSE. |
| Researcher Affiliation | Collaboration | 1Department of Electrical Engineering, Stanford University 2Google Research. |
| Pseudocode | Yes | Algorithm 1 The Poisson Binomial Mechanism; Algorithm 2 The (Scalar) Poisson Binomial Mechanism; Algorithm 3 Distributed DP-SGD |
| Open Source Code | No | The paper does not provide a direct link to open-source code for the described methodology or explicitly state that the code is publicly available. |
| Open Datasets | No | The paper describes generating synthetic data: 'we generate n = 1000 client vectors with dimension d = 250, i.e., x1, ..., xn ∈ ℝ^250. Each local vector has bounded ℓ2 and ℓ∞ norms, i.e., ‖xi‖₂ ≤ 1 and ‖xi‖∞ ≤ 1.' It does not refer to a pre-existing publicly available dataset with a specific name, link, or citation. |
| Dataset Splits | No | The paper describes using generated synthetic data for evaluation but does not specify explicit training, validation, or test dataset splits. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running experiments (e.g., GPU models, CPU types, or memory specifications). |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions or other libraries). |
| Experiment Setup | Yes | In Figure 3, we apply PBM with different m's (which dictate the communication cost) under a given privacy requirement ε. We set m = {2, 4, 6, 16}, and the corresponding communication costs (i.e., the logarithm of the field size that SecAgg operates on) are B = {11, 12, 13, 14}. In Figure 5, we perform PBM with modular clipping and set c = 30. |
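The scalar mechanism referenced in the pseudocode row (Algorithm 2) can be sketched in a few lines: each client maps its bounded scalar to a Binomial success probability, sends one Binomial(m, ·) sample, and the server debiases the aggregated sum. This is a minimal illustrative sketch, not the paper's code; the function names and the specific mapping p = 1/2 + θ·x (for x ∈ [-1, 1]) are assumptions based on the mechanism's description, and the secure-aggregation layer is elided (the server is simply handed the sum).

```python
import random

def pbm_encode(x, m=16, theta=0.25, rng=random):
    # Client side (assumed mapping): x in [-1, 1] is shifted to a success
    # probability p = 1/2 + theta*x, then one Binomial(m, p) sample is sent.
    # theta <= 1/2 keeps p inside [0, 1]; smaller theta means more privacy.
    p = 0.5 + theta * x
    return sum(rng.random() < p for _ in range(m))

def pbm_decode(total, n, m=16, theta=0.25):
    # Server side: E[total] = n*m/2 + m*theta*sum(x_i), so subtracting the
    # n*m/2 offset and rescaling by m*theta gives an unbiased sum estimate.
    return (total - n * m / 2) / (m * theta)
```

Unbiasedness follows directly from E[Binomial(m, p)] = m·p: the debiasing step cancels the n·m/2 offset exactly, which is the "unbiased" property in the paper's title.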
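The communication costs quoted in the setup row are internally consistent with n = 1000 clients: the SecAgg field must be large enough to hold the sum of n Binomial(m, ·) values, giving roughly ⌈log₂(nm)⌉ bits per entry. A small sketch (helper name is illustrative) reproduces the reported B values:

```python
import math

def secagg_bits(n, m):
    # The aggregated sum of n Binomial(m, .) samples lies in {0, ..., n*m},
    # so the paper's stated per-entry cost is ceil(log2(n*m)) bits
    # (ignoring the off-by-one when n*m is an exact power of two).
    return math.ceil(math.log2(n * m))

# With n = 1000 and m in {2, 4, 6, 16}, this yields B = {11, 12, 13, 14},
# matching the costs reported in the Experiment Setup row.
costs = [secagg_bits(1000, m) for m in (2, 4, 6, 16)]
```

This check suggests the m values and B values in the table correspond to the same n = 1000 synthetic setup described under Open Datasets.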