Sparse Spectrum Warped Input Measures for Nonstationary Kernel Learning

Authors: Anthony Tompkins, Rafael Oliveira, Fabio T. Ramos

NeurIPS 2020

Reproducibility assessment (variable, result, and LLM response for each item):
Research Type: Experimental. The method is extensively validated alongside related algorithms on synthetic and real-world datasets. We experimentally validate SSWIM alongside various state-of-the-art methods in both small and large data regimes, as well as expand upon the intuition in Section 4.1.1 by examining specific aspects of the model.
Researcher Affiliation: Collaboration. Anthony Tompkins (1), Rafael Oliveira (1, 2), Fabio Ramos (1, 3); (1) School of Computer Science, The University of Sydney, Australia; (2) ARC Centre for Data Analytics for Resources and Environments, Australia; (3) NVIDIA, USA.
Pseudocode: Yes.
Algorithm 1: Sparse Spectrum Warped Input Measures
Input: {X, y}
Output: θ = {θ_u, θ_g, θ_h, X_g, Y_g, X_h, Y_h}
Initialize pseudo-training points {X_g, Y_g}, {X_h, Y_h}
for t ∈ {1, ..., T} do
    Fit g and h to {X_g, Y_g}, {X_h, Y_h}
    Compute m̂ and Σ_m for X
    Fit u using the expected feature map
    Calculate log p(y | θ)
    Update gradients and take a new step
end for
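A rough PyTorch-style sketch of this loop is given below for illustration only; it is not the authors' released implementation, and the model methods fit_warpings, warped_moments, expected_feature_map, and log_marginal_likelihood are hypothetical placeholders for the corresponding steps of Algorithm 1.

import torch

def train_sswim(X, y, model, T=150, lr=1e-2):
    # `model` is assumed to bundle theta_u, theta_g, theta_h and the
    # pseudo-training sets (X_g, Y_g), (X_h, Y_h) as learnable parameters.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    for t in range(T):
        optimizer.zero_grad()
        model.fit_warpings()                                    # fit g and h to the pseudo-training points
        m_hat, sigma_m = model.warped_moments(X)                # compute m_hat and Sigma_m for X
        features = model.expected_feature_map(m_hat, sigma_m)   # fit u via the expected feature map
        loss = -model.log_marginal_likelihood(features, y)      # negative log p(y | theta)
        loss.backward()                                         # accumulate gradients
        optimizer.step()                                        # take a new step
    return model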
Open Source Code: Yes. PyTorch code is provided to reproduce the experimental results.
Open Datasets: Yes. "We compare our model on various real-world datasets including multiple regression tasks [44, 45, 46]."
[44] Dheeru Dua and Casey Graff. UCI machine learning repository. http://archive.ics.uci.edu/ml, 2017.
[45] Luís Torgo. Regression datasets. https://www.dcc.fc.up.pt/~ltorgo/Regression/DataSets.html, 2019.
Dataset Splits: No. The paper states: "We use 2/3 of the samples for training and the remaining 1/3 for testing." It does not specify a separate validation split or cross-validation details.
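For concreteness, the stated 2/3 train / 1/3 test split could be generated with a generic random hold-out such as the sketch below (not the authors' preprocessing code; the random seed is an assumption):

import torch

def train_test_split(X, y, train_frac=2/3, seed=0):
    # Randomly hold out the remaining 1/3 of the samples for testing.
    perm = torch.randperm(X.size(0), generator=torch.Generator().manual_seed(seed))
    n_train = int(train_frac * X.size(0))
    train_idx, test_idx = perm[:n_train], perm[n_train:]
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]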
Hardware Specification: Yes. All experiments were performed on a Linux machine with a single Titan V GPU.
Software Dependencies: No. The paper mentions GPyTorch but does not provide a specific version number for it or any other software dependencies.
Experiment Setup: Yes. We ran all methods for 150 iterations with stochastic gradient descent, and the library GPyTorch was used for DKL, DSDGP, SGPR, and SVGP.
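As an illustration of that setup for the SVGP baseline, the sketch below assumes the standard GPyTorch SVGP interface; the kernel, learning rate, and number of inducing points are assumptions, and only the 150 SGD iterations come from the paper's description.

import torch
import gpytorch

class SVGPBaseline(gpytorch.models.ApproximateGP):
    # Standard GPyTorch sparse variational GP used here as a stand-in for the SVGP comparison.
    def __init__(self, inducing_points):
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(inducing_points.size(0))
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution, learn_inducing_locations=True)
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(self.mean_module(x), self.covar_module(x))

def train_svgp(train_x, train_y, num_inducing=128, iters=150, lr=0.01):
    model = SVGPBaseline(train_x[:num_inducing].clone())
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.size(0))
    optimizer = torch.optim.SGD(list(model.parameters()) + list(likelihood.parameters()), lr=lr)
    model.train()
    likelihood.train()
    for _ in range(iters):                       # 150 iterations of stochastic gradient descent
        optimizer.zero_grad()
        loss = -mll(model(train_x), train_y)     # negative variational ELBO
        loss.backward()
        optimizer.step()
    return model, likelihood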