Towards Addressing Label Skews in One-Shot Federated Learning
Authors: Yiqun Diao, Qinbin Li, Bingsheng He
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our extensive experiments show that FedOV can significantly improve the test accuracy compared to state-of-the-art approaches in various label skew settings. |
| Researcher Affiliation | Academia | Yiqun Diao, Qinbin Li & Bingsheng He National University of Singapore {yiqun,qinbin,hebs}@comp.nus.edu.sg |
| Pseudocode | Yes | Algorithm 1: The FedOV algorithm. |
| Open Source Code | Yes | Code is available at https://github.com/Xtra-Computing/FedOV. |
| Open Datasets | Yes | Datasets We conduct experiments on MNIST, Fashion-MNIST, CIFAR-10 and SVHN datasets. We use the data partitioning methods in Li et al. (2021b) to simulate different label skews. |
| Dataset Splits | Yes | In each task, we use half of the test dataset as the public dataset for distillation for FedKT and FedDF and the remaining for testing. |
| Hardware Specification | Yes | All experiments are conducted on a single 3090 GPU. |
| Software Dependencies | No | The paper mentions the 'Adam optimizer' and 'ReLU as the activation function' but does not provide specific version numbers for any software dependencies such as programming languages or libraries. |
| Experiment Setup | Yes | For local training, we run 200 local epochs for each client. We set batch size to 64 and learning rate to 0.001. For PROSER, we choose β = 0.01, γ = 1, according to the default trade-off parameter setting in the official code. For adversarial learning, we set 5 local steps and each step size 0.002. |
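The paper states that label skews are simulated with the data partitioning methods of Li et al. (2021b). A common scheme in that benchmark is Dirichlet-based label skew, where each class's samples are split across clients according to proportions drawn from Dir(β). The sketch below is a minimal, hedged illustration of that idea, not the authors' exact code; the function name, default seed, and the choice of Dirichlet partitioning are assumptions for illustration.

```python
import numpy as np

def dirichlet_label_skew(labels, n_clients, beta, seed=0):
    """Illustrative Dirichlet label-skew partition (assumption: this is
    the partitioning style referred to, not the paper's verbatim code).

    For each class, proportions drawn from Dir(beta) decide how many of
    that class's samples each client receives; smaller beta means a more
    skewed (non-IID) partition.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        # shuffle this class's sample indices, then split by Dirichlet draws
        idx = rng.permutation(np.where(labels == c)[0])
        props = rng.dirichlet(np.full(n_clients, beta))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_indices[client].extend(part.tolist())
    return [np.array(ix) for ix in client_indices]
```

With a small β (e.g. 0.1) most clients end up dominated by a few labels, which is the regime FedOV targets; with a large β the split approaches IID.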