Faster Eigenvector Computation via Shift-and-Invert Preconditioning
Authors: Dan Garber, Elad Hazan, Chi Jin, Sham M. Kakade, Cameron Musco, Praneeth Netrapalli, Aaron Sidford
ICML 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We give faster algorithms and improved sample complexities for the fundamental problem of estimating the top eigenvector. |
| Researcher Affiliation | Collaboration | Dan Garber (DGARBER@TTIC.EDU), Toyota Technological Institute at Chicago; Elad Hazan (EHAZAN@CS.PRINCETON.EDU), Princeton University; Chi Jin (CHIJIN@BERKELEY.EDU), University of California, Berkeley; Sham M. Kakade (SHAM@CS.WASHINGTON.EDU), University of Washington; Cameron Musco (CNMUSCO@MIT.EDU), Massachusetts Institute of Technology; Praneeth Netrapalli (PRANEETH@MICROSOFT.COM) and Aaron Sidford (ASID@MICROSOFT.COM), Microsoft Research, New England |
| Pseudocode | No | The provided text does not contain any pseudocode or algorithm blocks. It refers to 'Algorithm 1 of our full paper', which is not included here; an illustrative sketch of the general shift-and-invert approach appears below the table. |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | No | The paper focuses on theoretical sample complexities, discussing 'samples from a distribution D' and a 'covariance matrix Σ', and does not name or provide access information for any publicly available dataset used in an empirical evaluation. |
| Dataset Splits | No | The paper does not describe any train/validation/test dataset splits, as it is a theoretical paper focusing on algorithm design and analysis rather than empirical validation. |
| Hardware Specification | No | The paper does not specify any hardware details (like CPU, GPU models, or cloud instances) used for running experiments, as it is a theoretical paper. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x) that would be needed to reproduce the work. It refers to methods such as SVRG, but not to specific software implementations or versions. |
| Experiment Setup | No | The paper does not describe specific experimental setup details such as hyperparameter values, learning rates, or training schedules, as it is a theoretical paper. |
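
Since the paper's Algorithm 1 is not reproduced in the excerpt above, the following is a minimal NumPy sketch, for orientation only, of the generic shift-and-invert power iteration the paper builds on: power iteration applied to (λI − Σ)⁻¹ with a shift λ slightly above the top eigenvalue. The function name `shift_invert_power`, the dense `np.linalg.solve` call, and the choice of shift are illustrative assumptions; the paper's contribution is to replace the exact linear solve with fast approximate stochastic solvers (e.g. SVRG), which this sketch does not implement.

```python
# Illustrative sketch of shift-and-invert power iteration for the top eigenvector
# of a symmetric PSD matrix A. This is NOT the paper's Algorithm 1; it only shows
# the generic idea: power iteration on (lam*I - A)^{-1}, where the shift `lam`
# is assumed to be a slight overestimate of the top eigenvalue lambda_1.
import numpy as np

def shift_invert_power(A, lam, num_iters=50, seed=0):
    """Approximate the top eigenvector of symmetric A via power iteration
    on the preconditioned matrix (lam*I - A)^{-1}."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    M = lam * np.eye(n) - A  # shifted matrix; positive definite when lam > lambda_1
    for _ in range(num_iters):
        # The paper performs this linear solve approximately with fast stochastic
        # solvers (e.g. SVRG); here a dense direct solve is used for simplicity.
        x = np.linalg.solve(M, x)
        x /= np.linalg.norm(x)
    return x

# Usage: random PSD test matrix; compare the Rayleigh quotient with lambda_1.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.standard_normal((100, 100))
    A = B @ B.T / 100.0                         # symmetric PSD matrix
    lam1 = np.linalg.eigvalsh(A)[-1]            # true top eigenvalue
    x = shift_invert_power(A, lam=1.1 * lam1)   # shift slightly above lambda_1
    print(x @ A @ x, lam1)                      # Rayleigh quotient ~ lambda_1
```

Running the script prints a Rayleigh quotient close to the true top eigenvalue. The closer the shift λ is to λ₁, the larger the relative eigengap of (λI − A)⁻¹ and the faster the iteration converges, which is the preconditioning effect the paper analyzes.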