Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization
Authors: Zhiqiang Xu, Peilin Zhao, Jianneng Cao, Xiaoli Li
ICML 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We theoretically analyze its convergence properties and empirically validate it on real-world datasets. |
| Researcher Affiliation | Academia | Institute for Infocomm Research, A*STAR, Singapore |
| Pseudocode | Yes | Algorithm 1 DSRG-EIGS |
| Open Source Code | No | The paper does not provide an explicit statement or link to the source code for the proposed DSRG-EIGS methodology. Footnote 4 refers to a third-party implementation of RG-EIGS. |
| Open Datasets | Yes | We first examine the performance of the algorithms on sparse matrices, which are downloaded from the University of Florida Sparse Matrix Collection. ... The statistics of resultant dense matrices are shown in Table 2, including their block sizes of uniform partitioning. We use q = 10 here. X is uniformly partitioned into qc = q/2 column blocks as well. |
| Dataset Splits | No | The paper describes how the input matrices and the variable X are partitioned into blocks for processing (e.g., "uniformly partitioned into a block matrix of size mr × mc", "uniformly partitioned into qc = q/2 column blocks"), but this is not a standard train/validation/test split for model evaluation. |
| Hardware Specification | No | Both DSRG-EIGS and RG-EIGS were implemented in Matlab on a machine with Windows OS and 8 GB of RAM; no CPU model or further hardware details are given. |
| Software Dependencies | No | Both DSRG-EIGS and RG-EIGS were implemented in Matlab. |
| Experiment Setup | Yes | Both RG-EIGS and DSRG-EIGS are fed with the same initial value of X, where each entry is sampled from the standard normal distribution N(0, 1) and then they all as a whole are orthogonalized. We set αt for DSRG-EIGS to take the form αt = η / (1 + ζt), where ζ is fixed to 2 throughout the experiments and η will be tuned. |
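The experiment setup above pins down two reproducible ingredients: the orthonormal random initialization of X and the diminishing step-size schedule αt = η / (1 + ζt) with ζ = 2. A minimal sketch of both in Python/NumPy (the paper's experiments used Matlab; the function names and the use of QR for orthogonalization are our own assumptions, and the DSRG-EIGS update itself is not shown):

```python
import numpy as np

def init_orthonormal(n, p, seed=0):
    """Sample an n x p matrix with i.i.d. N(0, 1) entries, then
    orthogonalize it as a whole (here via a QR factorization),
    matching the initialization described in the experiment setup."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, p))
    Q, _ = np.linalg.qr(X)  # columns of Q are orthonormal
    return Q

def step_size(t, eta, zeta=2.0):
    """Diminishing step size alpha_t = eta / (1 + zeta * t),
    with zeta fixed to 2 as in the paper; eta is tuned per dataset."""
    return eta / (1.0 + zeta * t)
```

For example, `init_orthonormal(100, 5)` returns a 100 x 5 matrix X with X.T @ X equal to the 5 x 5 identity, and the step size decays from η at t = 0 toward zero as iterations proceed.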