HONOR: Hybrid Optimization for NOn-convex Regularized problems

Authors: Pinghua Gong, Jieping Ye

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We conduct empirical studies on large-scale data sets and results demonstrate that HONOR converges significantly faster than state-of-the-art algorithms.
Researcher Affiliation | Academia | Pinghua Gong, University of Michigan, Ann Arbor, MI 48109, gongp@umich.edu; Jieping Ye, University of Michigan, Ann Arbor, MI 48109, jpye@umich.edu
Pseudocode | Yes | Algorithm 1: HONOR: Hybrid Optimization for NOn-convex Regularized problems
Open Source Code | No | The paper does not provide an explicit statement about releasing source code for the described methodology or a link to a code repository.
Open Datasets | Yes | All data sets can be downloaded from http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/.
Dataset Splits | No | The paper mentions using 'large-scale data sets' for experiments but does not provide specific details on training, validation, or test splits (e.g., percentages, sample counts, or cross-validation setup).
Hardware Specification | Yes | All algorithms are implemented in Matlab 2015a under a Linux operating system and executed on an Intel Core i7-4790 CPU (@3.6GHz) with 32GB memory.
Software Dependencies | Yes | All algorithms are implemented in Matlab 2015a under a Linux operating system.
Experiment Setup | Yes | We terminate the compared algorithms if the relative change of two consecutive objective function values is less than 10^-5 or the number of iterations exceeds 1000 (HONOR) or 10000 (GIST). For HONOR, we set γ = 10^-5, β = 0.5, α0 = 1 and the number of unrolling steps in L-BFGS as m = 10. For GIST, we use the non-monotone line search in experiments as it usually performs better than its monotone counterpart. To show how the convergence behavior of HONOR varies over the parameter ε, we use three values: ε = 10^-10, 10^-6, 10^-2.
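For concreteness, the stopping rule quoted above (terminate when the relative change of consecutive objective values drops below 10^-5, or the iteration cap of 1000 for HONOR / 10000 for GIST is reached) can be sketched as follows. This is a minimal illustration only: the driver loop, the `step_fn` placeholder, and the exact normalization of the relative change are assumptions, not the authors' code.

```python
import numpy as np

def relative_change_converged(f_prev, f_curr, tol=1e-5):
    # Relative change of two consecutive objective values; the max(|f_prev|, 1)
    # normalization is an assumed convention, not taken from the paper.
    return abs(f_curr - f_prev) / max(abs(f_prev), 1.0) < tol

def run_solver(step_fn, x0, max_iters=1000, tol=1e-5):
    """Generic driver illustrating the paper's termination rule.

    step_fn(x) -> (x_next, f_next) stands in for one iteration of HONOR
    (max_iters=1000) or GIST (max_iters=10000); it is a hypothetical hook.
    """
    x = x0
    f_prev = np.inf
    for _ in range(max_iters):
        x, f_curr = step_fn(x)
        if np.isfinite(f_prev) and relative_change_converged(f_prev, f_curr, tol):
            break  # relative objective change below 10^-5
        f_prev = f_curr
    return x
```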