On Convergence of FedProx: Local Dissimilarity Invariant Bounds, Non-smoothness and Beyond

Authors: Xiaotong Yuan, Ping Li

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | 3. If you ran experiments... [N/A] "Our work is focused on providing deeper theoretical understandings of FedProx and its stochastic variants under milder conditions."
Researcher Affiliation | Collaboration | Xiao-Tong Yuan, Nanjing Univ. of Information Sci. & Tech., 219 Ningliu Rd, Nanjing, China, xtyuan1980@gmail.com; Ping Li, LinkedIn Ads, 700 Bellevue Way NE, Bellevue, WA, USA, pinli@linkedin.com
Pseudocode | Yes | Algorithm 1 (FedMSPP: Federated Minibatch Stochastic Proximal Point)
Open Source Code | No | 4. If you are using existing assets (e.g., code, data, models) or curating/releasing new assets... [N/A] "We did not use any of such assets in this work."
Open Datasets | No | 3. If you ran experiments... [N/A] "Our work is focused on providing deeper theoretical understandings of FedProx and its stochastic variants under milder conditions."
Dataset Splits | No | 3. If you ran experiments... [N/A] "Our work is focused on providing deeper theoretical understandings of FedProx and its stochastic variants under milder conditions."
Hardware Specification | No | 3. If you ran experiments... [N/A] "Our work is focused on providing deeper theoretical understandings of FedProx and its stochastic variants under milder conditions."
Software Dependencies | No | The paper does not describe experimental implementations and therefore does not list specific software dependencies with version numbers.
Experiment Setup | No | 3. If you ran experiments... [N/A] "Our work is focused on providing deeper theoretical understandings of FedProx and its stochastic variants under milder conditions."
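The Pseudocode row refers to Algorithm 1 (FedMSPP), a federated minibatch stochastic proximal-point method. As a rough illustration of the general FedProx-style template such algorithms follow (not the paper's actual algorithm: the function names, the gradient-descent subproblem solver, and all parameter values below are our own assumptions for this sketch), each sampled client inexactly solves a proximal subproblem anchored at the server model, and the server averages the returned iterates:

```python
import numpy as np

def local_prox_step(grad_fn, w_server, mu, lr=0.1, steps=50):
    """Approximately solve min_w f_i(w) + (mu/2)||w - w_server||^2
    by plain gradient descent (an inexact proximal-point subproblem,
    as in FedProx-style local updates). Hypothetical sketch."""
    w = w_server.copy()
    for _ in range(steps):
        # Gradient of the local loss plus the proximal penalty term.
        g = grad_fn(w) + mu * (w - w_server)
        w -= lr * g
    return w

def fed_prox_round(client_grads, w_server, mu, sample_size=None, rng=None):
    """One communication round: sample a minibatch of clients, run their
    local proximal steps, and average the returned models."""
    rng = rng or np.random.default_rng(0)
    n = len(client_grads)
    k = sample_size or n
    chosen = rng.choice(n, size=k, replace=False)
    updates = [local_prox_step(client_grads[i], w_server, mu) for i in chosen]
    return np.mean(updates, axis=0)
```

With simple quadratic client losses f_i(w) = ||w - c_i||^2 / 2 (so grad_fn is w - c_i), repeated rounds drive the server model toward the average of the client optima, matching the intuition that the proximal term keeps local updates anchored while still allowing progress on heterogeneous objectives.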