Gradient Descent Meets Shift-and-Invert Preconditioning for Eigenvector Computation

Authors: Zhiqiang Xu

Venue: NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We test our algorithm on both synthetic and real data. Throughout experiments, our SI-rgEIGS solver is warm-started by a few power iterations, and four iterations of Nesterov's AGD are run to approximately solve the least-squares subproblems. The same initial x_0 is used for different solvers. All the algorithms are implemented in MATLAB and run single-threaded. All the ground-truth information is obtained by MATLAB's eigs function for benchmarking purposes. (See the warm-start sketch after this table.)
Researcher Affiliation | Industry | Zhiqiang Xu, Cognitive Computing Lab (CCL), Baidu Research; National Engineering Laboratory of Deep Learning Technology and Application, China; xuzhiqiang04@baidu.com
Pseudocode | Yes | Algorithm 1: Shift-and-Inverted Riemannian Gradient Descent Eigensolver. (See the main-loop sketch after this table.)
Open Source Code | Yes | The implementation of our algorithm is available at https://github.com/zhiqiangxu2001/SI-rgEIGS.
Open Datasets | Yes | We follow Shamir [2015] to generate synthetic data. Note that A's full eigenvalue decomposition can be written as A = V_n Σ V_n^T, where Σ is diagonal. Thus, it suffices to generate a random orthogonal matrix V_n and set Σ = diag(1, 1/σ, 1/σ^1.1, ..., 1/σ^1.4, g_1/n, ..., g_{n−6}/n) with g_i being standard normal samples, i.e., g_i ~ N(0, 1). Here we set n = 1000 and σ = 1.005. (See the data-generation sketch after this table.)
Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, or detailed splitting methodology) for training, validation, or testing.
Hardware Specification | No | The paper states that algorithms are implemented in MATLAB and run single-threaded, but it does not provide specific hardware details (e.g., CPU/GPU models, memory).
Software Dependencies | No | The paper states that algorithms are 'implemented in MATLAB' but does not specify the MATLAB version or any other software dependencies with version numbers.
Experiment Setup | Yes | Throughout experiments, our SI-rgEIGS solver is warm-started by a few power iterations, and four iterations of Nesterov's AGD are run to approximately solve the least-squares subproblems. The same initial x_0 is used for different solvers. Constant step-sizes are hand-tuned. For real data, we explore an automatic step-size scheme, specifically the Barzilai-Borwein (BB) step-size, which is a non-monotone step-size scheme that performs well in practice [Wen and Yin, 2013]. In our context, it is set as follows: α_{t+1} = ||x_t − x_{t−1}||_2^2 / |(x_t − x_{t−1})^T (ĝ_t − ĝ_{t−1})| or α_{t+1} = |(x_t − x_{t−1})^T (ĝ_t − ĝ_{t−1})| / ||ĝ_t − ĝ_{t−1}||_2^2. (See the step-size sketch after this table.)
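The power-iteration warm start mentioned in the Research Type row is a standard loop. Below is a minimal numpy sketch, not the author's MATLAB code; the iteration count k = 5 is an illustrative guess, since the paper only says "a few".

```python
import numpy as np

def power_warm_start(A, x0, k=5):
    """A few power iterations to roughly align x0 with the top eigenvector of A."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(k):
        x = A @ x                # multiply by A ...
        x /= np.linalg.norm(x)   # ... and renormalize onto the unit sphere
    return x
```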
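Algorithm 1 itself is a Riemannian gradient method on the unit sphere applied to the shift-and-inverted matrix B = (σI − A)^{-1} (ascent on f(x) = x^T B x / 2, equivalently descent on −f), where each product Bx is obtained by approximately solving a linear system rather than forming the inverse. The sketch below is a hypothetical numpy rendering under those assumptions, not the author's implementation: it substitutes a few conjugate-gradient steps for the Nesterov AGD inner solver used in the paper, and the function names, constant step-size, and iteration counts are illustrative.

```python
import numpy as np

def cg_solve(M, b, y0, n_steps=4):
    """A few conjugate-gradient steps for M y = b, with M symmetric positive definite.

    The paper solves these least-squares subproblems with four iterations of
    Nesterov's AGD; CG is substituted here only to keep the sketch short.
    """
    y = y0.copy()
    r = b - M @ y
    p = r.copy()
    rs = r @ r
    for _ in range(n_steps):
        Mp = M @ p
        a = rs / (p @ Mp)
        y += a * p
        r -= a * Mp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return y

def si_rg_eigs(A, shift, x0, n_iters=200, step=1.0):
    """Sketch of a shift-and-inverted Riemannian gradient eigensolver.

    Assumes shift > lambda_1(A), so that M = shift*I - A is positive definite
    and the top eigenvector of B = M^{-1} coincides with that of A.
    """
    n = A.shape[0]
    M = shift * np.eye(n) - A
    x = x0 / np.linalg.norm(x0)
    y = x.copy()                      # warm start for the inner solver
    for _ in range(n_iters):
        y = cg_solve(M, x, y)         # y ~= B @ x via an approximate solve
        rg = y - (x @ y) * x          # Riemannian gradient: tangent part of B x
        x = x + step * rg             # gradient step ...
        x /= np.linalg.norm(x)        # ... followed by retraction to the sphere
    return x
```

A call such as si_rg_eigs(A, shift=1.01, x0=np.random.randn(A.shape[0])) would then approximate the top eigenvector, assuming the shift exceeds the top eigenvalue (which is 1 for the synthetic data below); the paper's actual shift selection is not specified in the quoted excerpts.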
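Assuming the spectrum reconstructed in the Open Datasets row (six leading eigenvalues 1, 1/σ, 1/σ^1.1, ..., 1/σ^1.4, read here with a 0.1 increment in the exponent, followed by g_i/n), the synthetic matrix could be generated along these lines; the function name and seed are illustrative:

```python
import numpy as np

def synthetic_A(n=1000, sigma=1.005, seed=0):
    """Synthetic test matrix in the style of Shamir [2015], as described above."""
    rng = np.random.default_rng(seed)
    # Random orthogonal V_n via QR of a Gaussian matrix.
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    top = [1.0] + [sigma ** -p for p in (1.0, 1.1, 1.2, 1.3, 1.4)]
    tail = rng.standard_normal(n - 6) / n              # g_i ~ N(0, 1), scaled by 1/n
    eigvals = np.concatenate([np.array(top), tail])
    return V @ np.diag(eigvals) @ V.T                  # A = V_n Sigma V_n^T
```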
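The two BB step-sizes quoted in the Experiment Setup row translate directly, with s = x_t − x_{t−1} and d = ĝ_t − ĝ_{t−1}; a minimal sketch, where g denotes the gradient ĝ at the corresponding iterate:

```python
import numpy as np

def bb_step(x, x_prev, g, g_prev, variant=1):
    """Barzilai-Borwein step-sizes matching the two formulas quoted above."""
    s = x - x_prev                     # iterate difference
    d = g - g_prev                     # gradient difference
    if variant == 1:
        return (s @ s) / abs(s @ d)    # ||x_t - x_{t-1}||^2 / |s^T d|
    return abs(s @ d) / (d @ d)        # |s^T d| / ||g_t - g_{t-1}||^2
```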