Stochastic Second-Order Method for Large-Scale Nonconvex Sparse Learning Models

Authors: Hongchang Gao, Heng Huang

IJCAI 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results have verified the efficiency and correctness of our proposed method."
Researcher Affiliation | Academia | Hongchang Gao, Heng Huang; Department of Electrical and Computer Engineering, University of Pittsburgh, USA; hongchanggao@gmail.com, heng.huang@pitt.edu
Pseudocode | Yes | "Algorithm 1: Stochastic L-BFGS Algorithm for Solving Eq. (1)." (a generic sketch of such an update follows the table)
Open Source Code | No | The paper does not provide an explicit statement about releasing source code or a link to a code repository for the described methodology.
Open Datasets | Yes | "For the sparse linear regression model, we evaluate its performance on E2006-TFIDF dataset... For the sparse logistic regression model, we evaluate its classification performance on the RCV1-Binary dataset... Note that both datasets are available at the LIBSVM website: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets" (a loading sketch follows the table)
Dataset Splits | No | The paper gives training and testing counts for E2006-TFIDF (16,087 training, 3,308 testing) and RCV1-Binary (20,242 training, 677,399 testing, from which 5,000 samples per class are selected for testing), but it does not describe a validation set or any splitting methodology beyond these totals.
Hardware Specification | No | The paper does not provide any specific hardware details, such as the GPU or CPU models used to run the experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x) needed to replicate the experiments.
Experiment Setup | Yes | "Additionally, we set L = 10, M = 10, |B| = 10, and |B_H| = 50. The step length of each method is chosen to achieve the best performance... we set σ² = 0.01... Toy-1 is with n = 20000, d = 2000, s* = 100, s = 200, Σ = I. Toy-2 is with n = 50000, d = 5000, s* = 500, s = 1000... the sparsity level s is set as 2000... we set the sparsity level s as 500." (a toy-data generation sketch follows the table)
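
For the Open Datasets row: both datasets are distributed in LIBSVM (svmlight) text format. The sketch below shows one way to read such files with scikit-learn's load_svmlight_file; the loader choice and the local file names are assumptions for illustration, not tooling or paths named in the paper.

```python
# Minimal sketch: reading the LIBSVM-format datasets referenced above.
# File names are placeholders for local copies downloaded from
# https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets
# (an assumption, not a procedure given in the paper).
from sklearn.datasets import load_svmlight_file

# Sparse linear regression: E2006-TFIDF (16,087 training / 3,308 testing points).
X_tr, y_tr = load_svmlight_file("E2006.train.bz2")
X_te, y_te = load_svmlight_file("E2006.test.bz2", n_features=X_tr.shape[1])

# Sparse logistic regression: RCV1-Binary (20,242 training points).
X_rcv1, y_rcv1 = load_svmlight_file("rcv1_train.binary.bz2")

print(X_tr.shape, X_te.shape, X_rcv1.shape)
```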
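
For the Experiment Setup row: the toy problems are specified only through their sizes (n, d), the sparsity levels, the identity covariance Σ = I, and the noise variance σ² = 0.01. Below is a minimal sketch of generating Toy-1-sized data under the standard sparse linear model y = Xβ* + ε; the random support and the unit coefficient magnitudes are assumptions, not details from the paper.

```python
# Minimal sketch (assumptions noted above): synthetic sparse linear regression
# data with the Toy-1 sizes quoted in the Experiment Setup row.
import numpy as np

rng = np.random.default_rng(0)

n, d = 20000, 2000      # Toy-1 sample count and dimension
s_true = 100            # number of nonzero coefficients (s* in the quote)
sigma2 = 0.01           # noise variance quoted in the paper

# Design matrix with covariance Sigma = I, i.e. i.i.d. standard Gaussian features.
X = rng.standard_normal((n, d))

# True parameter: s_true nonzeros at random positions; unit magnitudes are an assumption.
beta_true = np.zeros(d)
support = rng.choice(d, size=s_true, replace=False)
beta_true[support] = rng.choice([-1.0, 1.0], size=s_true)

# Responses under the linear model with Gaussian noise of variance sigma2.
y = X @ beta_true + np.sqrt(sigma2) * rng.standard_normal(n)
```

Toy-2 would differ only in the quoted sizes (n = 50000, d = 5000, s* = 500, s = 1000).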
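
For the Pseudocode row: the report quotes only the title of Algorithm 1, so the sketch below is a generic mini-batch L-BFGS loop with hard thresholding, i.e. the family of methods the title suggests, not a reproduction of the paper's algorithm. In particular, the gradient-difference correction pairs, the fixed step size, and the least-squares objective used here are simplifications; the paper's L and |B_H| quantities (which govern how curvature pairs are formed) are not modeled.

```python
# Generic sketch (not the paper's Algorithm 1): mini-batch L-BFGS with hard
# thresholding for an l0-constrained least-squares objective. Memory size,
# batch size, step size, and thresholding rule are illustrative choices.
import numpy as np

def hard_threshold(w, s):
    """Keep the s largest-magnitude entries of w, zero out the rest."""
    out = np.zeros_like(w)
    keep = np.argsort(np.abs(w))[-s:]
    out[keep] = w[keep]
    return out

def two_loop_direction(grad, pairs):
    """Standard L-BFGS two-loop recursion over stored (s_k, y_k) pairs."""
    q = grad.copy()
    alphas = []
    for s_k, y_k in reversed(pairs):          # newest pair first
        rho = 1.0 / (y_k @ s_k)
        a = rho * (s_k @ q)
        alphas.append((rho, a))
        q -= a * y_k
    if pairs:
        s_k, y_k = pairs[-1]
        q *= (s_k @ y_k) / (y_k @ y_k)         # initial Hessian scaling
    for (s_k, y_k), (rho, a) in zip(pairs, reversed(alphas)):
        b = rho * (y_k @ q)
        q += (a - b) * s_k
    return -q                                  # descent direction

def sparse_lbfgs(X, y, s, steps=100, batch=50, eta=0.1, M=10, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    pairs = []                                 # stored (s_k, y_k) correction pairs
    prev_w = prev_g = None
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        g = X[idx].T @ (X[idx] @ w - y[idx]) / batch   # mini-batch gradient
        if prev_w is not None:
            # Gradient-difference pairs: a simplification of the separate
            # Hessian mini-batch update used in stochastic L-BFGS methods.
            s_k, y_k = w - prev_w, g - prev_g
            if s_k @ y_k > 1e-10:              # curvature (positivity) check
                pairs.append((s_k, y_k))
                pairs = pairs[-M:]             # keep at most M pairs
        prev_w, prev_g = w.copy(), g.copy()
        w = hard_threshold(w + eta * two_loop_direction(g, pairs), s)
    return w
```

With the toy data generated above, it could be called as `w = sparse_lbfgs(X, y, s=200)`, matching the sparsity level quoted for Toy-1.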