Proximal SCOPE for Distributed Sparse Learning

Authors: Shen-Yi Zhao, Gong-Duo Zhang, Ming-Wei Li, Wu-Jun Li

NeurIPS 2018

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Experimental results on real datasets show that pSCOPE can outperform other state-of-the-art distributed methods for sparse learning. |
| Researcher Affiliation | Academia | Shen-Yi Zhao, National Key Lab. for Novel Software Tech., Dept. of Comp. Sci. and Tech., Nanjing University, Nanjing 210023, China. zhaosy@lamda.nju.edu.cn |
| Pseudocode | Yes | Algorithm 1: Proximal SCOPE (an illustrative sketch of the loop appears after this table). |
| Open Source Code | No | The paper states that the datasets can be downloaded from the LibSVM website, but there is no explicit statement or link indicating that the authors' own source code for the method is publicly available. |
| Open Datasets | Yes | Evaluation is based on the four datasets in Table 1 (cov, rcv1, avazu, kdd2012), all of which can be downloaded from the LibSVM website: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ |
| Dataset Splits | No | The paper describes a "uniform partition" strategy for distributing data among workers, but does not give specific percentages or sample counts for train/validation/test splits. |
| Hardware Specification | Yes | The CPU for each machine has 12 Intel E5-2620 cores, the memory of each machine is 96GB, and the machines are connected by 10GB Ethernet. |
| Software Dependencies | No | The paper mentions implementing methods and using libraries (e.g., LibSVM), but it does not specify any software components with the version numbers required for reproduction. |
| Experiment Setup | Yes | pSCOPE is run until the gap P(w) − P(w*) ≤ 10⁻⁶. For each dataset, four data partitions are constructed: π (each worker holds the whole dataset), π1 (uniform partition), π2 (75% of positive and 25% of negative instances on the first 4 workers, the rest on the last 4), and π3 (all positive instances on the first 4 workers, all negative instances on the last 4). A sketch of this partition scheme also follows the table. |
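
The pseudocode row above refers to Algorithm 1 of the paper, which is not reproduced in this report. For orientation only, below is a minimal sequential sketch of a proximal-SVRG-style outer/inner loop in the spirit of pSCOPE, applied to L1-regularized logistic regression. Everything here is an assumption for illustration: the function names (`prox_l1`, `grad_logistic`, `pscope_sketch`), the choice of loss and regularizer, the hyperparameter values, and the sequential simulation of workers. It is not the authors' implementation.

```python
import numpy as np

def prox_l1(x, thresh):
    """Soft-thresholding: the proximal mapping of thresh * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

def grad_logistic(Xb, yb, v):
    """Gradient of the averaged logistic loss at v, labels in {-1, +1}."""
    s = 1.0 / (1.0 + np.exp(yb * (Xb @ v)))      # sigmoid(-y * x^T v)
    return -(Xb.T @ (yb * s)) / len(yb)

def pscope_sketch(parts, w, n_outer=10, n_inner=100, eta=0.1, lam=1e-4, seed=0):
    """Proximal-SVRG-style outer/inner loop in the spirit of pSCOPE.
    `parts` is a list of (X_k, y_k) local partitions, one per worker;
    the workers are simulated sequentially here rather than running in
    parallel on a cluster."""
    rng = np.random.default_rng(seed)
    for _ in range(n_outer):
        # "Master" step: full gradient over all partitions at the snapshot w.
        full_grad = np.mean([grad_logistic(X, y, w) for X, y in parts], axis=0)
        local_iterates = []
        for X, y in parts:                        # each worker's inner loop
            u = w.copy()
            for _ in range(n_inner):
                i = rng.integers(len(y))
                Xi, yi = X[i:i + 1], y[i:i + 1]
                # Variance-reduced stochastic gradient, then a proximal step.
                v = grad_logistic(Xi, yi, u) - grad_logistic(Xi, yi, w) + full_grad
                u = prox_l1(u - eta * v, eta * lam)
            local_iterates.append(u)
        w = np.mean(local_iterates, axis=0)       # aggregate local solutions
    return w
```

The averaging of local iterates at the end of each outer round reflects the general pattern of such methods: workers run variance-reduced proximal updates locally between synchronizations, which keeps communication infrequent.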
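
Similarly, to make the experiment-setup row concrete, here is a hedged sketch of how the π1/π2/π3 index partitions over 8 workers could be generated. The helper name `make_partitions`, the round-robin assignment, and the {-1, +1} label convention are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def make_partitions(y, n_workers=8, seed=0):
    """Illustrative construction of the pi1/pi2/pi3 partitions from the
    experiment setup. The trivial partition pi (every worker holds the
    full dataset) would simply be [np.arange(len(y))] * n_workers."""
    rng = np.random.default_rng(seed)
    pos = rng.permutation(np.where(y == 1)[0])
    neg = rng.permutation(np.where(y == -1)[0])
    half = n_workers // 2

    def spread(idx, k):
        # Deal indices round-robin across k worker slots.
        return [idx[j::k] for j in range(k)]

    # pi1: uniform partition of all instances across all workers.
    pi1 = spread(rng.permutation(len(y)), n_workers)

    # pi2: 75% of positives and 25% of negatives on the first half of the
    # workers; the remaining instances on the second half.
    p_cut, n_cut = int(0.75 * len(pos)), int(0.25 * len(neg))
    first = rng.permutation(np.concatenate([pos[:p_cut], neg[:n_cut]]))
    last = rng.permutation(np.concatenate([pos[p_cut:], neg[n_cut:]]))
    pi2 = spread(first, half) + spread(last, half)

    # pi3: all positives on the first half, all negatives on the second.
    pi3 = spread(pos, half) + spread(neg, half)
    return pi1, pi2, pi3
```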