Scalable and Efficient Pairwise Learning to Achieve Statistical Accuracy

Authors: Bin Gu, Zhouyuan Huo, Heng Huang (pp. 3697-3704)

AAAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "The experimental results on a variety of real-world datasets not only confirm the effectiveness of our AdaDSG algorithm, but also show that AdaDSG has significantly better scalability and efficiency than the existing pairwise learning algorithms."
Researcher Affiliation | Collaboration | Bin Gu (JDDGlobal.com), Zhouyuan Huo (Department of Electrical & Computer Engineering, University of Pittsburgh, USA), Heng Huang (JDDGlobal.com and Department of Electrical & Computer Engineering, University of Pittsburgh, USA)
Pseudocode | Yes | "Algorithm 1: Adaptive doubly stochastic gradient algorithm (AdaDSG) ... Algorithm 2: DSGD algorithm"
Open Source Code | No | "We implemented our AdaDSG algorithm in MATLAB." (The paper describes implementing its own algorithm but provides no link to, or explicit statement about releasing, its source code; it only links to code for the comparison algorithms.)
Open Datasets | Yes | "Table 3 summarizes the eight real-world benchmark datasets used in our experiments. They are the A9a, Covtype, Ijcnn1, Phishing, Usps, Mnist, Rcv1 and Real-sim datasets from the LIBSVM repository." The LIBSVM repository is available at https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/.
Dataset Splits | No | "We randomly partitioned each dataset into 75% for training and 25% for testing." (The paper specifies the training and testing splits but does not mention a validation split or its size; see the split sketch after this table.)
Hardware Specification | Yes | "Our experiments were performed on an 8-core Intel Xeon E3-1240 machine."
Software Dependencies | No | "We implemented our AdaDSG algorithm in MATLAB." (The paper names MATLAB but does not give a version number for it or for any other software dependency.)
Experiment Setup | Yes | "For our AdaDSG algorithm, the initial learning rate γ_0 was tuned from 1 to 10^-4, and the outer loop number was set as 20. In the implementation of our AdaDSG algorithm, we set V_n = 1/n, and set the inner loop number of DSGD for the subproblem R_m as m." (An illustrative sketch of this setup is given below.)
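
The 75/25 partition reported in the Dataset Splits row can be reproduced with a plain random split. The Python sketch below is illustrative only: the helper name, the seed, and the use of NumPy are our assumptions, not details from the paper, and no validation set is created because none is reported.

```python
# Illustrative only: random 75% train / 25% test partition as reported in the
# Dataset Splits row; no validation split is created because none is mentioned.
import numpy as np

def split_75_25(X, y, seed=0):  # hypothetical helper, not from the paper
    rng = np.random.default_rng(seed)
    perm = rng.permutation(X.shape[0])
    n_train = int(round(0.75 * len(perm)))
    train, test = perm[:n_train], perm[n_train:]
    return X[train], y[train], X[test], y[test]
```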
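The Experiment Setup row fixes three choices: a learning-rate grid from 1 down to 10^-4, 20 outer loops, and an inner DSGD loop whose length equals the current subproblem size m. The sketch below only mirrors that skeleton; the pairwise squared-loss gradient, the doubling subproblem sizes, and all function names are hypothetical placeholders and do not reproduce the paper's AdaDSG or DSGD algorithms.

```python
# Illustrative skeleton only. It encodes the reported choices (learning-rate
# grid 1 ... 1e-4, 20 outer loops, inner loop length equal to the subproblem
# size m); everything else (placeholder pairwise loss, doubling subproblem
# sizes) is an assumption, not the paper's AdaDSG/DSGD.
import numpy as np

GAMMA_GRID = [10.0 ** (-k) for k in range(5)]  # 1, 1e-1, ..., 1e-4
OUTER_LOOPS = 20

def pair_grad(w, xi, xj, yi, yj):
    # Placeholder pairwise squared-loss gradient on the difference vector.
    d = xi - xj
    return -((yi - yj) - w @ d) * d

def dsgd(w, X, y, gamma0, m, rng):
    # Inner solver: m stochastic steps over randomly sampled pairs.
    for t in range(1, m + 1):
        i, j = rng.integers(0, X.shape[0], size=2)
        w = w - (gamma0 / np.sqrt(t)) * pair_grad(w, X[i], X[j], y[i], y[j])
    return w

def adaptive_run(X, y, gamma0, rng):
    # Outer loop: 20 rounds over growing subproblems (doubling is an assumption).
    w = np.zeros(X.shape[1])
    m = max(2, X.shape[0] // 2 ** OUTER_LOOPS)
    for _ in range(OUTER_LOOPS):
        m = min(X.shape[0], 2 * m)
        w = dsgd(w, X[:m], y[:m], gamma0, m, rng)
    return w
```

A grid search that calls adaptive_run once per value in GAMMA_GRID and keeps the best test performance would then correspond to the tuning of the initial learning rate described in the row above.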