Fast and Accurate Refined Nyström-Based Kernel SVM

Authors: Zhe Li, Tianbao Yang, Lijun Zhang, Rong Jin

AAAI 2016

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "The experimental results demonstrate that (i) the obtained dual solution by our approach in the first step is closer to the optimal solution and yields improved prediction performance; and (ii) the second step using the obtained dual solution to re-train the model further improves the performance." |
| Researcher Affiliation | Collaboration | 1. Department of Computer Science, The University of Iowa, Iowa City, IA 52242, USA; 2. National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China; 3. Alibaba Group, Seattle, WA 98101, USA |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | Yes | "We present empirical evaluations of the proposed refined Nyström-based kernel classifier on six real-world datasets, namely usps, letter, ijcnn1, webspam, codrna and covtype, of which we use the version available on the LIBSVM website": http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ |
| Dataset Splits | Yes | "Through cross-validation, we choose the best parameter C from 2^[-6:1:6] and the best parameter γ for the RBF kernel from 2^[-6:2:6]." |
| Hardware Specification | No | "In our experiments, we implement both the feature construction by the Nyström method and the optimization of linear SVM in a cluster environment. The training data is randomly partitioned over 5 nodes." The description "cluster environment" and "5 nodes" is not specific enough regarding CPU/GPU models, memory, or other detailed specifications. |
| Software Dependencies | No | "We run linear SVM and kernel SVM using LIBLINEAR and LIBSVM, respectively." The paper names the software but does not provide version numbers. |
| Experiment Setup | Yes | "Through cross-validation, we choose the best parameter C from 2^[-6:1:6] and the best parameter γ for the RBF kernel from 2^[-6:2:6]." |
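Since the assessment above notes that no source code is released, the quoted setup can only be sketched. The following is a minimal NumPy reconstruction, under stated assumptions, of the two ingredients the quotes describe: standard Nyström feature construction for an RBF kernel, and the cross-validation grids 2^[-6:1:6] for C and 2^[-6:2:6] for γ. The function names (`rbf_kernel`, `nystrom_features`) and the uniform landmark sampling are illustrative choices, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma, eps=1e-12):
    """Nystrom feature map: Phi = K_nm @ K_mm^{-1/2}, so Phi @ Phi.T ~ K."""
    K_mm = rbf_kernel(landmarks, landmarks, gamma)   # m x m landmark kernel
    w, V = np.linalg.eigh(K_mm)                      # symmetric eigendecomposition
    w = np.maximum(w, eps)                           # guard tiny/negative eigenvalues
    K_mm_inv_sqrt = (V * w ** -0.5) @ V.T            # K_mm^{-1/2}
    return rbf_kernel(X, landmarks, gamma) @ K_mm_inv_sqrt  # n x m features

# Cross-validation grids quoted in the table: 2^[-6:1:6] and 2^[-6:2:6].
C_grid = [2.0 ** p for p in range(-6, 7)]         # 13 values: 2^-6, ..., 2^6
gamma_grid = [2.0 ** p for p in range(-6, 7, 2)]  # 7 values: 2^-6, 2^-4, ..., 2^6

# Toy illustration (synthetic data, not the paper's datasets).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
landmarks = X[rng.choice(50, size=10, replace=False)]  # uniform landmark sampling
Phi = nystrom_features(X, landmarks, gamma_grid[3])    # gamma = 2^0 = 1
print(Phi.shape)  # (50, 10)
```

In the pipeline the quotes describe, `Phi` would then be passed to a linear SVM solver (LIBLINEAR in the paper) once for each (C, γ) pair in the grids, with the best pair chosen by cross-validation.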