Communication-Efficient Distributed SVD via Local Power Iterations
Authors: Xiang Li, Shusen Wang, Kun Chen, Zhihua Zhang
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct experiments to demonstrate the effectiveness of Local Power. |
| Researcher Affiliation | Academia | 1School of Mathematical Sciences, Peking University, China 2Department of Computer Science, Stevens Institute of Technology, USA. |
| Pseudocode | Yes | Algorithm 1 Local Power |
| Open Source Code | No | The paper does not include an explicit statement about releasing source code for the described methodology or a direct link to a code repository. |
| Open Datasets | Yes | We use 15 datasets available on the LIBSVM website: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/. |
| Dataset Splits | No | The paper states that data samples are 'randomly shuffled and then partitioned among m nodes' but does not specify explicit train/validation/test dataset splits for reproducibility. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers needed to replicate the experiment. |
| Experiment Setup | Yes | All the algorithms start from the same initialization Y0. We fix the target rank to k = 5. We set m = max(n/1000, 3) so that each node has s = 1,000 samples, unless n is too small. For three variants of Local Power we fix p = 4 (without decaying p). |
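To make the setup in the table concrete, below is a minimal NumPy sketch of a LocalPower-style distributed power iteration: each of m nodes runs p local power steps on its own data shard between communication rounds, and the server averages and re-orthonormalizes the iterates. This is an illustration only, not the paper's reference implementation; the function name `local_power`, the round count, and the plain averaging of local iterates (the paper discusses more careful alignment of local subspaces) are assumptions made for this sketch.

```python
import numpy as np

def local_power(blocks, k=5, p=4, rounds=10, seed=0):
    """Sketch of LocalPower-style distributed SVD (illustrative, not the
    paper's exact algorithm): p local power iterations per node between
    communication rounds, then average and re-orthonormalize."""
    rng = np.random.default_rng(seed)
    d = blocks[0].shape[1]
    # all algorithms start from the same initialization Y0
    Y, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(rounds):
        local_iterates = []
        for A in blocks:               # one shard per node
            Yi = Y.copy()
            for _ in range(p):         # p local power iterations
                Yi = A.T @ (A @ Yi)
                Yi, _ = np.linalg.qr(Yi)  # orthonormalize for stability
            local_iterates.append(Yi)
        # server step: average local iterates, then QR
        Y, _ = np.linalg.qr(np.mean(local_iterates, axis=0))
    return Y  # approximate top-k right singular subspace

# toy usage mirroring the table's setup: n samples shuffled and
# partitioned across m nodes of s = 1,000 samples each, k = 5, p = 4
n, d, m = 3000, 20, 3
A = np.random.default_rng(1).standard_normal((n, d))
blocks = np.array_split(A, m)
Y = local_power(blocks, k=5, p=4)
```

Averaging the raw local iterates is only safe up to the orthogonal ambiguity of each local subspace; the paper's pseudocode handles this more carefully, so treat the server step above as a simplification.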