Streaming Principal Component Analysis in Noisy Settings
Authors: Teodor Vanislavov Marinov, Poorya Mianjy, Raman Arora
ICML 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The paper reports empirical results in Section 7, "Experimental Results". |
| Researcher Affiliation | Academia | Department of Computer Science, Johns Hopkins University, Baltimore, USA. |
| Pseudocode | No | The paper refers to 'Algorithm 2 of (Warmuth & Kuzmin, 2008)' but does not include its own pseudocode or clearly labeled algorithm block. |
| Open Source Code | No | The paper does not contain any explicit statements about releasing source code or provide links to a code repository. |
| Open Datasets | Yes | We evaluate empirical performance of our algorithms with missing data (MGD-MD, Oja-MD) and partial observations (MGD-PO, Oja-PO) on two real datasets, MNIST (LeCun et al., 1998) and XRMB (Westbury, 1994)... |
| Dataset Splits | Yes | The initial learning rate η0 is chosen using cross validation on a held-out set. |
| Hardware Specification | No | The paper discusses computational complexity and runtime, but does not provide specific details on the hardware (e.g., GPU models, CPU types, or memory) used for running the experiments. |
| Software Dependencies | No | The paper discusses various algorithms and methods (e.g., MGD, Oja's algorithm) but does not list specific software dependencies with their version numbers required to replicate the experiments. |
| Experiment Setup | Yes | The learning rate for variants of MGD and Oja's algorithm is set to η_t = η_0/t, for MGD-PO to η_t = r²η_0/t, and for MGD-MD to η_t = q²η_0/t. The initial learning rate η_0 is chosen using cross validation on a held-out set. (These schedules are sketched below the table.) |
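The learning-rate schedules quoted in the Experiment Setup row can be written out directly. The sketch below is not the authors' code; it is a minimal illustration assuming the paper's notation, where η_0 is the initial rate chosen by cross-validation on a held-out set, q is the per-coordinate observation probability in the missing-data setting, and r is the number of observed coordinates per sample in the partial-observation setting. Function names are hypothetical.

```python
def lr_mgd_or_oja(t: int, eta0: float) -> float:
    """eta_t = eta0 / t for plain MGD and Oja's algorithm."""
    return eta0 / t

def lr_mgd_po(t: int, eta0: float, r: int) -> float:
    """eta_t = r^2 * eta0 / t for the partial-observation variant (MGD-PO)."""
    return (r ** 2) * eta0 / t

def lr_mgd_md(t: int, eta0: float, q: float) -> float:
    """eta_t = q^2 * eta0 / t for the missing-data variant (MGD-MD)."""
    return (q ** 2) * eta0 / t

if __name__ == "__main__":
    # Illustrative values only; in the paper eta0 comes from cross-validation.
    for t in (1, 10, 100):
        print(t,
              lr_mgd_or_oja(t, eta0=0.1),
              lr_mgd_po(t, eta0=0.1, r=5),
              lr_mgd_md(t, eta0=0.1, q=0.3))
```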