Is Input Sparsity Time Possible for Kernel Low-Rank Approximation?

Authors: Cameron Musco, David Woodruff

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We show that for a broad class of kernels, including the popular Gaussian and polynomial kernels, computing a relative-error rank-k approximation to K is at least as difficult as multiplying the input data matrix A ∈ ℝ^{n×d} by an arbitrary matrix C ∈ ℝ^{d×k}. Barring a breakthrough in fast matrix multiplication, when k is not too large, this requires Ω(nnz(A)·k) time, where nnz(A) is the number of non-zeros in A. This lower bound matches, in many parameter regimes, recent work on subquadratic-time algorithms for low-rank approximation of general kernels [MM16, MW17], demonstrating that these algorithms are unlikely to be significantly improved, in particular to O(nnz(A)) input-sparsity runtimes. At the same time there is hope: we show for the first time that O(nnz(A))-time approximation is possible for general radial basis function kernels (e.g., the Gaussian kernel) for the closely related problem of low-rank approximation of the kernelized dataset.
Researcher Affiliation | Academia | Cameron Musco, MIT (cnmusco@mit.edu); David P. Woodruff, Carnegie Mellon University (dwoodruf@cs.cmu.edu)
Pseudocode | No | No pseudocode or algorithm blocks were found.
Open Source Code | No | The paper does not provide any statement or link for open-source code specific to the methodology described.
Open Datasets | No | This is a theoretical paper with no empirical evaluation on datasets, so no public dataset availability is discussed.
Dataset Splits | No | As a theoretical paper, it does not discuss training, validation, or test splits.
Hardware Specification | No | The paper is theoretical and describes no experimental hardware.
Software Dependencies | No | The paper is theoretical and lists no software dependencies or version numbers.
Experiment Setup | No | The paper is theoretical and provides no experimental setup or hyperparameters.
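The paper itself contains no pseudocode, but the flavor of its positive result — that radial basis function kernels admit fast approximate low-rank factorizations of the kernelized dataset — can be illustrated with a standard, unrelated technique: random Fourier features (Rahimi and Recht). This is a minimal sketch only, not the authors' algorithm; the dimensions `n`, `d`, the feature count `m`, and the bandwidth `sigma` are arbitrary choices for the demo. Computing the feature matrix Z costs roughly O(nnz(A)·m) time, and Z·Zᵀ approximates the exact Gaussian kernel matrix K.

```python
import numpy as np

def gaussian_kernel(A, sigma):
    # Exact Gaussian kernel: K[i, j] = exp(-||a_i - a_j||^2 / (2 sigma^2)).
    sq = np.sum(A ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * A @ A.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def random_fourier_features(A, m, sigma, seed=0):
    # Rahimi-Recht random features: E[Z @ Z.T] ~= K. The dominant cost is
    # the product A @ W, which takes time proportional to nnz(A) * m.
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, m))     # frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)          # phase shifts
    return np.sqrt(2.0 / m) * np.cos(A @ W + b)

n, d, m = 200, 10, 2000                 # demo sizes (assumptions)
sigma = np.sqrt(d)                      # bandwidth chosen so K is non-trivial
A = np.random.default_rng(1).normal(size=(n, d))

K = gaussian_kernel(A, sigma)           # exact n x n kernel matrix
Z = random_fourier_features(A, m, sigma)
err = np.linalg.norm(Z @ Z.T - K) / np.linalg.norm(K)
```

A genuine rank-k factor could then be obtained by taking the top-k SVD of the n×m matrix Z instead of ever forming the n×n matrix K; the point of the sketch is only that an explicit feature map sidesteps the quadratic cost of materializing K.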