Nonconvex Federated Learning on Compact Smooth Submanifolds With Heterogeneous Data
Authors: Jiaojiao Zhang, Jiang Hu, Anthony Man-Cho So, Mikael Johansson
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical experiments demonstrate that our algorithm has significantly smaller computational and communication overhead than existing methods. |
| Researcher Affiliation | Academia | 1. KTH Royal Institute of Technology; 2. University of California, Berkeley; 3. The Chinese University of Hong Kong |
| Pseudocode | Yes | Algorithm 1 Proposed algorithm |
| Open Source Code | No | Question: Does the paper provide open access to the data and code, with sufficient instructions to faithfully reproduce the main experimental results, as described in supplemental material? Answer: [No] Justification: We did not provide the code during the submission stage. |
| Open Datasets | Yes | We conduct experiments where the matrix Ai is from the Mnist dataset. |
| Dataset Splits | No | "The specific experiment settings can be found in Appendix A.4.1." and "The Mnist dataset consists of 60,000 handwritten digit images ranging from 0 to 9... To construct the heterogeneous Ai, we sort the rows in increasing order of their associated digits and then split every 60000/n rows, with n = 10 as the number of clients, among each client." The paper describes how the data is distributed among clients (see the partitioning sketch after this table) and gives dataset characteristics, but it does not explicitly specify training, validation, and test splits. |
| Hardware Specification | No | Question: For each experiment, does the paper provide sufficient information on the computer resources (type of compute workers, memory, time of execution) needed to reproduce the experiments? Answer: [No] Justification: Our experiments can be completed on a laptop. |
| Software Dependencies | No | The paper does not provide specific software dependencies or versions used for the experiments. |
| Experiment Setup | Yes | "The specific experiment settings can be found in Appendix A.4.1." ... "In Fig. 1, we set τ = 10 and η = 1/β for all algorithms, where β is the square of the largest singular value of col{A_i}_{i=1}^{n}. We set ηg = 1 to facilitate comparison with other algorithms." See the step-size sketch after this table. |
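
The dataset-splits row describes a label-sorted, contiguous partition of MNIST across n = 10 clients to make the local matrices A_i heterogeneous. The following is a minimal sketch of that procedure, assuming a NumPy-based setup; the function and variable names are illustrative and not taken from the authors' code.

```python
import numpy as np

def partition_mnist_by_digit(images, labels, n_clients=10):
    """Sort MNIST rows by digit label, then hand each client a contiguous block."""
    order = np.argsort(labels, kind="stable")   # group rows by digit 0..9
    sorted_images = images[order]
    # 60000 / 10 = 6000 rows per client; each block is dominated by a few digits,
    # which is what makes the local matrices A_i heterogeneous across clients.
    return np.array_split(sorted_images, n_clients, axis=0)

# Example usage with stand-in data of MNIST shape:
# images = np.random.rand(60000, 784)
# labels = np.random.randint(0, 10, size=60000)
# A = partition_mnist_by_digit(images, labels)   # A[i] plays the role of client i's A_i
```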
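The experiment-setup row quotes a step-size rule in which β is the square of the largest singular value of the stacked matrix col{A_i}_{i=1}^{n} and η = 1/β. A minimal sketch of computing these quantities is below, again under the assumption of a NumPy-based setup with hypothetical names.

```python
import numpy as np

def step_size_from_stacked_matrix(client_matrices):
    """Return (beta, eta) with beta = sigma_max(col{A_i})^2 and eta = 1/beta."""
    stacked = np.vstack(client_matrices)        # col{A_i}_{i=1}^{n}: stack local matrices
    sigma_max = np.linalg.norm(stacked, ord=2)  # largest singular value of the stacked matrix
    beta = sigma_max ** 2
    return beta, 1.0 / beta

# In the reported setup, tau = 10 local updates per round and eta_g = 1 (global step size)
# were fixed, while beta and eta are derived from the data as above.
```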