Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features
Authors: Zihao Chen, Luo Luo, Zhihua Zhang
AAAI 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "In this paper, with certain restrictions on the communication allowed in the procedures, we develop tight lower bounds on communication rounds for a broad class of non-incremental algorithms under this setting. We also provide a lower bound on communication rounds for a class of (randomized) incremental algorithms." Section 5 ("Proof of Main Results") provides proofs of Theorems 2 and 4, with a proof framework based on (Nesterov 2013). |
| Researcher Affiliation | Academia | Zihao Chen, Zhiyuan College, Shanghai Jiao Tong University (z.h.chen@sjtu.edu.cn); Luo Luo, Department of Computer Science and Engineering, Shanghai Jiao Tong University (ricky@sjtu.edu.cn); Zhihua Zhang, School of Mathematical Sciences, Peking University (zhzhang@math.pku.edu.cn) |
| Pseudocode | No | The paper defines algorithm families (F_{λ,L} and I_{λ,L}) and their operations but does not provide structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statements or links indicating that its own source code is publicly available. |
| Open Datasets | No | This is a theoretical paper focusing on communication lower bounds for optimization algorithms, not on empirical training with specific datasets. Therefore, it does not provide access information for a dataset. |
| Dataset Splits | No | This is a theoretical paper, and it does not describe any experimental setup involving dataset splits for training, validation, or testing. |
| Hardware Specification | No | This is a theoretical paper that does not describe running experiments, and therefore it does not specify any hardware used. |
| Software Dependencies | No | This is a theoretical paper focused on mathematical proofs and algorithm analysis, and it does not describe any specific software dependencies with version numbers for experimental reproducibility. |
| Experiment Setup | No | This is a theoretical paper that does not describe any specific experimental setup, hyperparameters, or system-level training settings. |