Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition
Authors: Zeyuan Allen-Zhu, Yuanzhi Li
ICML 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study k-GenEV, the problem of finding the top k generalized eigenvectors, and k-CCA, the problem of finding the top k vectors in canonical correlation analysis. We propose algorithms LazyEV and LazyCCA to solve the two problems with running times linearly dependent on the input size and on k. Furthermore, our algorithms are doubly-accelerated: our running times depend only on the square root of the matrix condition number, and on the square root of the eigengap. This is the first such result for both k-GenEV and k-CCA. We also provide the first gap-free results, which provide running times that depend on 1/ε rather than the eigengap. |
| Researcher Affiliation | Collaboration | ¹Microsoft Research, ²Princeton University. Correspondence to: Zeyuan Allen-Zhu <zeyuan@csail.mit.edu>, Yuanzhi Li <yuanzhil@cs.princeton.edu>. |
| Pseudocode | Yes | Algorithm 1 AppxPCA(A, M, δ, ε, p) ... Algorithm 2 LazyEV(A, M, k, δ, ε_pca, p) |
| Open Source Code | No | The paper does not include an unambiguous statement or a direct link to the source code for the methodology described. |
| Open Datasets | No | This paper is theoretical and does not mention using or training on any specific publicly available datasets. |
| Dataset Splits | No | This paper is theoretical and does not mention any dataset splits for validation. |
| Hardware Specification | No | This is a theoretical paper and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | This is a theoretical paper and does not mention specific software dependencies with version numbers. |
| Experiment Setup | No | This is a theoretical paper and does not describe any experimental setup details such as hyperparameters or training configurations. |
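
For context on the two problems quoted in the Research Type row above, the sketch below is a minimal dense baseline for k-GenEV and k-CCA. It is not the paper's LazyEV/LazyCCA (those are iterative, doubly-accelerated methods whose iteration counts depend on the square roots of the condition number and the eigengap); the matrices A, B, X, Y, the regularizer, and the choice of k are illustrative assumptions.

```python
# Minimal dense baseline for k-GenEV and k-CCA (NOT the paper's LazyEV/LazyCCA).
# Assumes: A symmetric, B symmetric positive-definite, data matrices X (n x dx), Y (n x dy).
import numpy as np
from scipy.linalg import eigh, svd


def top_k_genev(A, B, k):
    """k-GenEV: top-k generalized eigenvectors of the pencil (A, B), i.e. A v = lambda B v."""
    eigvals, eigvecs = eigh(A, B)             # dense solver, eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:k]     # keep the k largest generalized eigenvalues
    return eigvals[order], eigvecs[:, order]


def _inv_sqrt(S):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T


def top_k_cca(X, Y, k, reg=1e-8):
    """k-CCA: top-k canonical correlations/directions via explicit whitening.

    Canonical correlations are the singular values of Sxx^{-1/2} Sxy Syy^{-1/2}.
    """
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])   # small ridge for invertibility (assumption)
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    Wx, Wy = _inv_sqrt(Sxx), _inv_sqrt(Syy)
    U, s, Vt = svd(Wx @ Sxy @ Wy)
    # Map the whitened directions back to the original coordinates.
    return s[:k], Wx @ U[:, :k], Wy @ Vt[:k].T


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    B = M @ M.T + 50 * np.eye(50)                  # SPD matrix for the pencil
    A = rng.standard_normal((50, 50)); A = (A + A.T) / 2
    vals, vecs = top_k_genev(A, B, k=3)

    X = rng.standard_normal((500, 20))
    Y = X @ rng.standard_normal((20, 15)) + 0.1 * rng.standard_normal((500, 15))
    corrs, phi, psi = top_k_cca(X, Y, k=3)
    print(vals, corrs)
```

The baseline forms the whitening matrices Sxx^{-1/2} and Syy^{-1/2} explicitly, which costs a full decomposition; the paper's point is precisely to avoid this, obtaining running times linear in the input size and in k, with only square-root dependence on the condition number and the eigengap.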