Large-Scale Learning with Fourier Features and Tensor Decompositions

Authors: Frederiek Wesel, Kim Batselier

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We demonstrate by means of numerical experiments how our low-rank tensor approach obtains the same performance as the corresponding nonparametric model, consistently outperforming random Fourier features." "We implemented our Tensor-Kernel Ridge Regression (T-KRR) algorithm in Mathworks MATLAB 2021a (Update 1) [24] and tested it on several regression and classification problems."
Researcher Affiliation | Academia | "Frederiek Wesel, Delft Center for Systems and Control, Delft University of Technology, f.wesel@tudelft.nl; Kim Batselier, Delft Center for Systems and Control, Delft University of Technology, k.batselier@tudelft.nl"
Pseudocode | No | The paper describes the algorithm in paragraph text but does not provide structured pseudocode or an algorithm block.
Open Source Code | Yes | "Our implementation can be freely downloaded from https://github.com/fwesel/T-KRR and allows reproduction of all experiments in this section."
Open Datasets | Yes | "We consider five UCI [8] datasets in order to compare the performance of our model with RFF and the GPR/KRR baseline." "The Airline dataset [14, 16] is a large-scale regression problem originally considered in [14]."
Dataset Splits | No | "For each dataset, we consider 90% of the data for training and the remaining 10% for testing." "Training the model is then accomplished with 2/3N datapoints, with the remaining portion reserved for testing." The paper does not explicitly specify a validation split used in its experiments.
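The 90/10 train/test protocol quoted above can be sketched as a simple shuffled split. This is an illustrative reconstruction, not the paper's MATLAB code; the function name and seed handling are assumptions.

```python
import numpy as np

def train_test_split(X, y, test_fraction=0.1, seed=0):
    """Shuffle the data and hold out a fraction for testing (90/10 here)."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    perm = rng.permutation(N)
    n_test = int(round(test_fraction * N))
    test_idx, train_idx = perm[:n_test], perm[n_test:]
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]

# Example with synthetic data:
X = np.random.randn(1000, 5)
y = np.random.randn(1000)
X_tr, y_tr, X_te, y_te = train_test_split(X, y)
print(X_tr.shape, X_te.shape)  # (900, 5) (100, 5)
```

Note that no validation portion is carved out, matching the checklist's observation that the paper does not specify one.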
Hardware Specification | Yes | "All experiments were run on a Dell Inc. Latitude 7410 laptop with 16 GB of RAM and an Intel Core i7-10610U CPU running at 1.80 GHz."
Software Dependencies | Yes | "We implemented our Tensor-Kernel Ridge Regression (T-KRR) algorithm in Mathworks MATLAB 2021a (Update 1) [24]."
Experiment Setup | Yes | "One sweep of our T-KRR algorithm is defined as updating the factor matrices in the order 1 → D and then back from D → 1. All initial factor matrices were initialized with standard normally distributed numbers and normalized by dividing all entries by their Frobenius norm. For all experiments the number of sweeps of the T-KRR algorithm is set to 10."
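The initialization and sweep schedule quoted above can be sketched as follows. This is an assumed reconstruction in Python (the paper's code is MATLAB): the factor-matrix shapes `(M, R)` and the function names are hypothetical, and whether the turn-around factor D is updated once or twice per sweep is read literally from the quote.

```python
import numpy as np

def init_factors(D, M, R, seed=0):
    """Draw each of the D factor matrices i.i.d. standard normal and
    divide all entries by that matrix's Frobenius norm, per the paper."""
    rng = np.random.default_rng(seed)
    factors = []
    for _ in range(D):
        W = rng.standard_normal((M, R))
        factors.append(W / np.linalg.norm(W, "fro"))
    return factors

def sweep_order(D):
    """One sweep: update factors 1..D, then back D..1 (1-based indices,
    literal reading of the quoted schedule)."""
    return list(range(1, D + 1)) + list(range(D, 0, -1))

factors = init_factors(D=4, M=8, R=3)
print([round(float(np.linalg.norm(W)), 6) for W in factors])  # each 1.0
print(sweep_order(4))  # [1, 2, 3, 4, 4, 3, 2, 1]
```

With 10 sweeps, the full schedule is simply `sweep_order(D)` repeated 10 times.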