A Scalable and Extensible Framework for Superposition-Structured Models

Authors: Shenjian Zhao, Cong Xie, Zhihua Zhang

AAAI 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical analysis on various datasets shows that our framework is potentially powerful, and achieves super-linear convergence rate for optimizing some popular superposition-structured statistical models such as the fused sparse group lasso. We implement all the experiments on a single machine running the 64-bit version of Linux with an Intel Core i5-3470 CPU and 8 GB RAM. We test the SEP-QN framework on various real-world datasets..."
Researcher Affiliation | Academia | "Shenjian Zhao, Cong Xie, and Zhihua Zhang, Department of Computer Science and Engineering, Shanghai Jiao Tong University, {zhao1014,xcgoner,zhihua}@sjtu.edu.cn"
Pseudocode | Yes | "Algorithm 1 gives the basic framework of SEP-QN. In Algorithm 2 we present the method of solving the problem (3). Algorithm 3: Adaptive Initial Hessian."
Open Source Code | No | The paper does not contain any statement about releasing its own code, nor a link to it; it only mentions existing libraries such as LIBLINEAR and GLMNET.
Open Datasets | Yes | "We test the SEP-QN framework on various real-world datasets such as gisette (n = 6,000 and p = 5,000) and epsilon (n = 300,000 and p = 2,000), which can be downloaded from the LIBSVM website (http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets)."
Dataset Splits | No | Table 1 lists 'n (train)' and 'n (test)' for each dataset, but the paper does not mention any validation split or specific cross-validation setup.
Hardware Specification | Yes | "We implement all the experiments on a single machine running the 64-bit version of Linux with an Intel Core i5-3470 CPU and 8 GB RAM."
Software Dependencies | No | The paper mentions existing tools such as LIBLINEAR, GLMNET, CVX, and TFOCS that were used or compared against, but it does not specify software dependencies (programming languages, libraries, or framework versions) for the authors' own implementation.
Experiment Setup | Yes | "For fairness of comparison, we use the same dataset gisette and the same setting of the tuning parameter λ as (Lee, Sun, and Saunders 2012; Yuan, Ho, and Lin 2012). For fused sparse logistic regression (λ1 = 2/n, λ2 = 2/n, N = 2) and group sparse logistic model (λ1 = 2/n, λ2 = 0, γj = 2/n)."
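
The Pseudocode row quotes the captions of Algorithms 1-3 but does not reproduce them. As a rough illustration only, here is a minimal sketch of a generic proximal quasi-Newton outer loop in the spirit of Lee, Sun, and Saunders (2012), which the paper builds on for comparison; it is not the authors' Algorithm 1, and the names grad_f, prox_solver, and x0, as well as the BFGS update and stopping rule, are assumptions made for the sketch.

    import numpy as np

    def proximal_quasi_newton(grad_f, prox_solver, x0, max_iter=100, tol=1e-6):
        """Illustrative proximal quasi-Newton loop (not SEP-QN's Algorithm 1).

        Minimizes f(x) + sum_k r_k(x), where f is smooth and prox_solver is
        assumed to (approximately) solve the scaled proximal subproblem
        argmin_z  g.T (z - x) + 0.5 (z - x).T H (z - x) + sum_k r_k(z).
        """
        x = x0.astype(float).copy()
        H = np.eye(x.size)        # initial Hessian approximation; the paper's
                                  # Algorithm 3 adapts this, here it stays fixed
        g = grad_f(x)
        for _ in range(max_iter):
            x_new = prox_solver(x, g, H)   # inner solve, e.g. by ADMM or FISTA
            if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
                return x_new
            g_new = grad_f(x_new)
            s, y = x_new - x, g_new - g
            sy = float(s @ y)
            if sy > 1e-12:                  # BFGS update only if curvature holds
                Hs = H @ s
                H += np.outer(y, y) / sy - np.outer(Hs, Hs) / float(s @ Hs)
            x, g = x_new, g_new
        return x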
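
The Open Datasets row points to the LIBSVM collection. A minimal sketch of loading one of the named datasets in Python, assuming scikit-learn is available and that a gisette file in LIBSVM (svmlight) format has been downloaded and decompressed from the listed URL; the local file name is an assumption, not something stated in the table.

    # Sketch: read a gisette training file in LIBSVM (svmlight) format.
    from sklearn.datasets import load_svmlight_file

    X, y = load_svmlight_file("gisette_scale")  # sparse features, labels in {-1, +1}
    print(X.shape, y.shape)                     # should match n = 6,000 and p = 5,000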
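
The Experiment Setup row lists λ1, λ2, and γj without restating the objective they parameterize. Under the usual definition of the fused sparse group lasso penalty (an assumption here, since the table does not reproduce the paper's formula), the regularized logistic regression objective would read, in LaTeX notation:

    \min_{w}\ \frac{1}{n}\sum_{i=1}^{n}\log\big(1+\exp(-y_i x_i^{\top} w)\big)
      + \lambda_1\|w\|_1
      + \lambda_2\sum_{j=2}^{p}|w_j - w_{j-1}|
      + \sum_{j}\gamma_j\|w_{G_j}\|_2

On this reading, λ1 = λ2 = 2/n for fused sparse logistic regression and λ1 = 2/n, λ2 = 0, γj = 2/n for the group sparse logistic model; the meaning of N = 2 (presumably the number of superposed penalties) is not spelled out in the quoted text.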