Addressing Class Imbalance in Federated Learning
Authors: Lixu Wang, Shichao Xu, Xiao Wang, Qi Zhu
AAAI 2021, pp. 10165-10173 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments demonstrate the importance of acknowledging class imbalance and taking measures as early as possible in FL training, and the effectiveness of our method in mitigating the impact. Our method is shown to significantly outperform previous methods, while maintaining client privacy. |
| Researcher Affiliation | Academia | Lixu Wang, Shichao Xu, Xiao Wang, Qi Zhu Northwestern University, Evanston, IL, USA {lixuwang2025, shichaoxu2023}@u.northwestern.edu, {wangxiao, qzhu}@northwestern.edu |
| Pseudocode | No | The paper describes its methods using prose and mathematical equations but does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We choose four different datasets: MNIST, CIFAR10, Fer2013 (Goodfellow et al. 2013), and FEMNIST of LEAF benchmark (Caldas et al. 2018). |
| Dataset Splits | Yes | For all data sets, we allocate them to clients without replacement, and the detailed data splitting is visualized in the SM. |
| Hardware Specification | No | Please refer to the SM for more details about the auxiliary data, and the setting for hardware. (The main paper itself does not specify hardware details; they are deferred to the supplementary material.) |
| Software Dependencies | No | We implement the algorithms mainly in PyTorch. (No specific version numbers are provided for PyTorch or any other software dependencies.) |
| Experiment Setup | Yes | The local training batch size is 32, the learning rate λ=0.001, and the optimizer is SGD. |
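The stated local-training hyperparameters (batch size 32, learning rate λ = 0.001, SGD) can be illustrated with a minimal sketch. This is not the authors' code: it runs plain SGD on a hypothetical toy 1-D least-squares problem purely to show the update rule w ← w − λ∇L with the paper's stated values; all variable names and the toy data are illustrative assumptions.

```python
# Illustrative sketch only (not the authors' implementation).
# Paper's stated local-training setup: batch size 32, learning rate 0.001, SGD.
BATCH_SIZE = 32   # local training batch size from the paper
LR = 0.001        # learning rate lambda from the paper

def sgd_step(w, grads, lr=LR):
    """One plain SGD update: w_i <- w_i - lr * g_i."""
    return [wi - lr * gi for wi, gi in zip(w, grads)]

def mse_grad(w, xs, ys):
    """Gradient of mean squared error for a toy 1-D linear model y = w*x."""
    n = len(xs)
    return [sum(2 * (w[0] * x - y) * x for x, y in zip(xs, ys)) / n]

# One mini-batch of BATCH_SIZE toy points drawn from y = 2x (assumed data).
xs = [i / BATCH_SIZE for i in range(BATCH_SIZE)]
ys = [2 * x for x in xs]

w = [0.0]
for _ in range(10_000):
    w = sgd_step(w, mse_grad(w, xs, ys))
# w[0] approaches the true slope 2.0 under these settings.
```

In a PyTorch training loop, the equivalent configuration would be `torch.optim.SGD(model.parameters(), lr=0.001)` with a `DataLoader` of `batch_size=32`.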