Bias Propagation in Federated Learning
Authors: Hongyan Chang, Reza Shokri
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide an empirical analysis based on real-world datasets. |
| Researcher Affiliation | Academia | Hongyan Chang & Reza Shokri School of Computing National University of Singapore {hongyan,reza}@comp.nus.edu.sg |
| Pseudocode | No | The paper does not contain pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | We provide details about the model, datasets, and implementations in Appendix A, and the code for the paper is available at https://github.com/privacytrustlab/bias_in_FL. |
| Open Datasets | Yes | We use two datasets with different tasks for our empirical analysis: US Census Data and CelebA (Liu et al., 2015)... We evaluate the effect of FL on local fairness on a real-world medical dataset, ISIC2019 (Abay et al., 2020; Tschandl et al., 2018; Combalia et al., 2019) |
| Dataset Splits | Yes | Each party has 1,000 training points and 2,000 test points... The train, test, and validation datasets ratio for each party is 6:2:2. |
| Hardware Specification | Yes | We run all experiments on Ubuntu with two NVIDIA TITAN RTX GPUs. |
| Software Dependencies | No | The paper mentions software like FedML (He et al., 2020), PyTorch (Paszke et al., 2019), and Captum (Kokhlikyan et al., 2020) but does not provide specific version numbers for these dependencies. |
| Experiment Setup | Yes | We train two-layer neural network models for all the tasks. We train a fully connected neural network model with one hidden layer of 32 neurons for Income and 64 neurons for Employment and Health tasks. For all the tasks, we use the ReLU activation function. We use an SGD optimizer with a learning rate of 0.001 for centralized training on Health and Employment datasets and 0.1 for other settings, and the batch size is 32. We train the NN models for 200 epochs. In FL, each client updates the global model for 1 epoch and shares it with the server. |
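
The "Experiment Setup" row can be sketched in PyTorch (the framework the paper cites). This is a minimal illustration, not the authors' code: the input width `NUM_FEATURES` and the binary output head are assumptions, and `local_update` is a hypothetical helper showing the reported one-local-epoch FL round.

```python
import torch
from torch import nn

NUM_FEATURES = 54  # hypothetical; depends on the Census task's features
HIDDEN = 32        # 32 neurons for Income; 64 for Employment/Health

# Fully connected network with one hidden layer and ReLU, as reported.
model = nn.Sequential(
    nn.Linear(NUM_FEATURES, HIDDEN),
    nn.ReLU(),
    nn.Linear(HIDDEN, 2),  # assumed binary classification head
)

# SGD with the reported hyperparameters (lr 0.1 for FL settings; batch 32).
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def local_update(model, optimizer, loader):
    """One FL round: a client trains the global model for a single
    epoch, then would send the updated weights back to the server."""
    model.train()
    for x, y in loader:
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()
    return model.state_dict()
```

With batch size 32, a forward pass on a batch of shape `(32, NUM_FEATURES)` yields logits of shape `(32, 2)`.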