Adaptive Group Personalization for Federated Mutual Transfer Learning

Authors: Haoqing Xu, Dian Shen, Meng Wang, Beilun Wang

ICML 2024

Reproducibility assessment (variable, result, and supporting LLM response):
Research Type: Experimental — "Empirical results show AdaGrP achieves 16.9% average improvement in learnability structure recovery compared with state-of-the-art CFL baselines. ... Synthetic and real-world experiments: We conduct both synthetic and real-world experiments to compare the proposed AdaGrP with state-of-the-art baselines. Results show that AdaGrP outperforms significantly with about 96.32% learnability structure recovery accuracy compared with 91.52% of the best baseline."
Researcher Affiliation: Academia — "¹School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; ²Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China; ³College of Design and Innovation, Tongji University, Shanghai, China."
Pseudocode: Yes — "Algorithm 1: Learnability Structure Recovery with Difference Standardization Ψ(θ; λ); Algorithm 2: Adaptive Threshold Correction Λ(·); Algorithm 3: AdaGrP at time step τ"
Open Source Code: No — The paper does not provide a statement or link indicating the availability of open-source code for the described methodology.
Open Datasets: Yes — "We apply AdaGrP and the above baseline methods in the NOAA nClimDiv database (Vose et al., 2014) for the average temperature prediction task."
Dataset Splits: Yes — "The samples are divided with the proportion of 7 : 1 : 2 for training, validation, and testing."
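The 7 : 1 : 2 split quoted above can be sketched as a simple shuffled index partition. This is a minimal illustration, not the paper's code (which is not released); the function name and the fixed seed are our own assumptions.

```python
import numpy as np

def split_7_1_2(n_samples, seed=0):
    """Partition sample indices 7:1:2 into train/validation/test.

    Hypothetical helper illustrating the split ratio reported in the
    paper; the shuffling strategy and seed are assumptions.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(0.7 * n_samples)   # 70% training
    n_val = int(0.1 * n_samples)     # 10% validation
    # remaining 20% is the test set
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

train_idx, val_idx, test_idx = split_7_1_2(1000)
```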
Hardware Specification: Yes — "All the experiments were conducted on a Linux server with two Intel(R) Xeon(R) Gold 5117 CPUs and 256 GiB memory."
Software Dependencies: No — The paper does not provide specific software dependencies with version numbers (e.g., programming languages, libraries, frameworks with their versions).
Experiment Setup: Yes — "For each method, we set the maximum number of communication rounds R = 30 and the maximum number of local steps T = 10000. ... Learning rate: ... we found that η = 0.001 has the best convergence rate... For the NOAA dataset, we choose η = 0.005... Clients stop early when the validation error does not drop for 50 updates."
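The early-stopping rule quoted above (halt a client's local training once the validation error has not improved for 50 consecutive updates, up to T = 10000 local steps) can be sketched as follows. This is a generic patience-based loop under our own assumptions, not the authors' implementation; `step_fn` and `val_error_fn` are hypothetical callbacks standing in for a client's optimizer step and validation pass.

```python
def train_with_early_stopping(step_fn, val_error_fn,
                              max_steps=10000, patience=50):
    """Run local updates with patience-based early stopping.

    Sketch of the setup described in the paper: at most `max_steps`
    local steps (T = 10000), stopping once `patience` (50) consecutive
    updates pass without a new best validation error.
    """
    best_err = float("inf")
    stale = 0
    step = 0
    for step in range(max_steps):
        step_fn()                  # one local optimization step
        err = val_error_fn()       # validation error after the step
        if err < best_err:
            best_err, stale = err, 0
        else:
            stale += 1
            if stale >= patience:  # no improvement for 50 updates
                break
    return step + 1, best_err

# Usage with a synthetic error curve: improves for 10 steps, then plateaus.
errs = iter(list(range(9, -1, -1)) + [5] * 100)
steps_run, best = train_with_early_stopping(lambda: None, lambda: next(errs))
```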