Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features
Authors: Zihao Chen, Luo Luo, Zhihua Zhang
AAAI 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, with certain restrictions on the communication allowed in the procedures, we develop tight lower bounds on communication rounds for a broad class of non-incremental algorithms under this setting. We also provide a lower bound on communication rounds for a class of (randomized) incremental algorithms. [...] In this section, we provide proofs of Theorem 2 and Theorem 4. The proof framework of these theorems is based on (Nesterov 2013). |
| Researcher Affiliation | Academia | Zihao Chen, Zhiyuan College, Shanghai Jiao Tong University; Luo Luo, Department of Computer Science and Engineering, Shanghai Jiao Tong University; Zhihua Zhang, School of Mathematical Sciences, Peking University. |
| Pseudocode | No | The paper defines algorithm families (F_{λ,L} and I_{λ,L}) and their operations but does not provide structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statements or links indicating that its own source code is publicly available. |
| Open Datasets | No | This is a theoretical paper focusing on communication lower bounds for optimization algorithms, not on empirical training with specific datasets. Therefore, it does not provide access information for a dataset. |
| Dataset Splits | No | This is a theoretical paper, and it does not describe any experimental setup involving dataset splits for training, validation, or testing. |
| Hardware Specification | No | This is a theoretical paper that does not describe running experiments, and therefore, it does not specify any hardware used. |
| Software Dependencies | No | This is a theoretical paper focused on mathematical proofs and algorithm analysis, and it does not describe any specific software dependencies with version numbers for experimental reproducibility. |
| Experiment Setup | No | This is a theoretical paper that does not describe any specific experimental setup, hyperparameters, or system-level training settings. |