Faster Kernel Matrix Algebra via Density Estimation
Authors: Arturs Backurs, Piotr Indyk, Cameron Musco, Tal Wagner
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we empirically evaluate the kernel noisy power method (Algorithm 1) for approximating the top eigenvalue of the kernel matrix K ∈ ℝ^{n×n} associated with an input dataset of n points. ... All results are reported in Figure 1 on the following page. Both the Uniform and KNPM variants of the noisy power method give a much better tradeoff in terms of accuracy vs. computation than the full power method. Additionally, KNPM consistently outperforms Uniform. |
| Researcher Affiliation | Collaboration | 1Toyota Technological Institute at Chicago, Chicago, IL, USA 2Massachusetts Institute of Technology, Cambridge, MA, USA 3University of Massachusetts Amherst, Amherst, MA, USA 4Microsoft Research Redmond, Redmond, WA, USA. |
| Pseudocode | Yes | Algorithm 1 Kernel Noisy Power Method |
| Open Source Code | No | No explicit statement or link indicating the release of source code for the described methodology. |
| Open Datasets | Yes | We use classes of the Forest Covertype dataset (Blackard & Dean, 1999)... We also use the full training set of the MNIST dataset (60K points in 784 dimensions). |
| Dataset Splits | No | The paper mentions using the 'full training set of the MNIST dataset' and the 'Forest Covertype dataset', but does not provide specific details on how these datasets were split into training, validation, and test sets, or if cross-validation was used. |
| Hardware Specification | No | No specific hardware details (like GPU/CPU models, memory, or cloud instance types) are provided. The paper states that its efficiency measure 'is software and architecture-free, unaffected by access to specialized libraries ... or hardware'. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9, or specific solver versions) are mentioned. |
| Experiment Setup | Yes | We use bandwidth σ = 0.05 (other choices produce similar results). ... we start with a small sampling rate, and gradually increase it by multiplying by 1.1 in each iteration. |
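The rows above reference Algorithm 1 (the kernel noisy power method) and its setup: Gaussian kernel with bandwidth σ = 0.05, and a sampling rate that starts small and grows by a factor of 1.1 per iteration. A minimal sketch of that idea, assuming a Gaussian kernel of the form exp(−‖x−y‖²/(2σ²)) and plain uniform sampling of kernel entries for the approximate matrix-vector product (the function and parameter names here are illustrative, not the authors' implementation):

```python
import numpy as np

def gaussian_kernel_row(X, x, sigma):
    """Gaussian kernel values k(x, x_j) for every row x_j of X."""
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_noisy_power_method(X, sigma=0.05, iters=20,
                              init_rate=0.01, growth=1.1, seed=0):
    """Sketch of a noisy power method for the top eigenvalue of the
    kernel matrix K, using uniform sampling for approximate matvecs.
    The sampling rate grows by `growth` (1.1 in the paper) per iteration."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    rate = init_rate
    for _ in range(iters):
        m = max(1, int(rate * n))
        y = np.zeros(n)
        for i in range(n):
            # Unbiased estimate of (K v)_i from m uniformly sampled entries.
            idx = rng.integers(0, n, size=m)
            k = gaussian_kernel_row(X[idx], X[i], sigma)
            y[i] = (n / m) * np.dot(k, v[idx])
        v = y / np.linalg.norm(y)
        rate = min(1.0, rate * growth)
    # Rayleigh quotient with an exact matvec, as the eigenvalue estimate.
    Kv = np.array([np.dot(gaussian_kernel_row(X, X[i], sigma), v)
                   for i in range(n)])
    return float(np.dot(v, Kv))
```

The paper's KNPM variant replaces the uniform sampler with a kernel density estimation data structure to get better per-entry importance sampling; uniform sampling is shown here only because it is the simpler baseline ("Uniform") from the experiments.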