Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

FedLab: A Flexible Federated Learning Framework

Authors: Dun Zeng, Siqi Liang, Xiangjing Hu, Hui Wang, Zenglin Xu

JMLR 2023 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | FedLab is a lightweight open-source framework for the simulation of federated learning. The design of FedLab focuses on federated learning algorithm effectiveness and communication efficiency. It allows customization on server optimization, client optimization, communication agreement, and communication compression. Also, FedLab is scalable in different deployment scenarios with different computation and communication resources. We hope FedLab could provide flexible APIs as well as reliable baseline implementations and relieve the burden of implementing novel approaches for researchers in the FL community. ... Users can easily conduct intensive comparison experiments based on benchmark datasets and baseline algorithms integrated in FedLab.
Researcher Affiliation | Academia | 1 University of Electronic Science and Technology of China; 2 Peng Cheng Lab; 3 Harbin Institute of Technology, Shenzhen; 4 Shenzhen Research Institute, The Chinese University of Hong Kong
Pseudocode | Yes | Listing 1: Code examples for server and client

    # ==== Server Example
    smodel = AlexNet()
    shandler = ServerHandler(smodel)  # Optimization part
    snetwork = DistNetwork((server_ip, server_port), world_size, server_rank)  # Configuration
    smanager = ServerManager(shandler, snetwork)  # Communication part
    smanager.run()

    # ==== Client Example
    cmodel = AlexNet()
    ctrainer = Trainer(cmodel, train_loader, optimizer, criterion)  # Optimization part
    cnetwork = DistNetwork((server_ip, server_port), world_size, client_rank)  # Configuration
    cmanager = ClientManager(ctrainer, cnetwork)  # Communication part
    cmanager.run()
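At the core of the server/client workflow shown above is the aggregation of client model updates, most commonly sample-weighted averaging in the style of FedAvg. The following is a minimal NumPy sketch of that aggregation step, an illustration of the concept only, not FedLab's actual API (the function name `fedavg_aggregate` is hypothetical):

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """Sample-weighted average of client parameter vectors (FedAvg-style).

    client_params: list of 1-D arrays, one flattened model per client.
    client_sizes:  number of local training samples held by each client.
    """
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()               # normalize by total sample count
    stacked = np.stack(client_params)      # shape (n_clients, n_params)
    return weights @ stacked               # weighted sum over clients

# Two clients: the first holds 3x the data, so it dominates the average.
p1 = np.array([1.0, 2.0])
p2 = np.array([3.0, 6.0])
global_params = fedavg_aggregate([p1, p2], client_sizes=[300, 100])
print(global_params)  # prints [1.5 3. ]
```

In a real deployment the server would replace its global model with `global_params` and broadcast it back to the sampled clients for the next round.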
Open Source Code | Yes | The source code, tutorial, and documentation can be found at https://github.com/SMILELab-FL/FedLab.
Open Datasets | No | A series of data partition schemes for both IID and Non-IID from different data distribution settings (Yurochkin et al., 2019; Acar et al., 2021; Caldas et al., 2018; Li et al., 2021a) are already provided, including more than 12 data partition schemes for 15 datasets, as shown in Data Partition and Dataset of Figure 1(b). ... Users can easily conduct intensive comparison experiments based on benchmark datasets and baseline algorithms integrated in FedLab.
Dataset Splits | No | A series of data partition schemes for both IID and Non-IID from different data distribution settings (Yurochkin et al., 2019; Acar et al., 2021; Caldas et al., 2018; Li et al., 2021a) are already provided, including more than 12 data partition schemes for 15 datasets, as shown in Data Partition and Dataset of Figure 1(b).
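A common non-IID partition scheme in the literature cited above (e.g., Yurochkin et al., 2019) draws each client's class proportions from a Dirichlet distribution, where a smaller concentration parameter yields more skewed label splits. Below is a minimal sketch of that idea; the helper `dirichlet_partition` is hypothetical and not FedLab's actual partitioner API:

```python
import numpy as np

def dirichlet_partition(labels, n_clients, alpha, seed=0):
    """Split sample indices across clients with Dirichlet-skewed class mixes.

    Smaller alpha -> more heterogeneous (non-IID) label distributions.
    Returns one array of sample indices per client.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_idx = [[] for _ in range(n_clients)]
    for cls in np.unique(labels):
        idx = rng.permutation(np.where(labels == cls)[0])
        # Fraction of this class assigned to each client.
        props = rng.dirichlet(alpha * np.ones(n_clients))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for cid, part in enumerate(np.split(idx, cuts)):
            client_idx[cid].extend(part.tolist())
    return [np.array(ix, dtype=int) for ix in client_idx]

labels = np.repeat([0, 1, 2], 100)  # 300 samples, 3 balanced classes
parts = dirichlet_partition(labels, n_clients=5, alpha=0.5)
assert sum(len(p) for p in parts) == len(labels)  # every sample assigned once
```

With `alpha=0.5` individual clients typically end up dominated by one or two classes; raising `alpha` toward large values recovers a near-IID split.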
Hardware Specification | No | Each computer could simulate an arbitrary number of clients' calculation tasks depending on hardware resources.
Software Dependencies | No | For best compatibility with interfaces of PyTorch (Paszke et al., 2019), communication APIs of FedLab are built for tensor communication. ... FedLab is built on the most popular ML framework PyTorch (according to a study on Hugging Face and Papers with Code 1)
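Building communication APIs around tensors usually means flattening every model parameter into one contiguous vector before a single send call, then restoring the original shapes on the receiving side. The sketch below shows that serialize/deserialize round trip in NumPy; the function names are illustrative assumptions, not FedLab's actual utilities (which operate on PyTorch tensors):

```python
import numpy as np

def serialize(params):
    """Concatenate a dict of parameter arrays into one flat vector."""
    return np.concatenate([p.ravel() for p in params.values()])

def deserialize(flat, template):
    """Rebuild arrays with the names and shapes of `template` from a flat vector."""
    out, offset = {}, 0
    for name, p in template.items():
        out[name] = flat[offset:offset + p.size].reshape(p.shape)
        offset += p.size
    return out

model = {"w": np.arange(6.0).reshape(2, 3), "b": np.zeros(3)}
flat = serialize(model)              # shape (9,): one buffer, one send call
restored = deserialize(flat, model)
assert all(np.array_equal(model[k], restored[k]) for k in model)
```

Packing parameters this way keeps the wire protocol independent of model architecture: both sides only need to agree on the parameter order and shapes.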
Experiment Setup | No | The paper provides code examples but does not specify concrete hyperparameters, optimizer settings, or other detailed training configurations used for experiments.