Randomly Projected Additive Gaussian Processes for Regression
Authors: Ian Delbridge, David Bindel, Andrew Gordon Wilson
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate our approach can achieve faster inference and improved predictive accuracy for high-dimensional inputs compared to kernels in the original input space. (...) We evaluate RPA-GP and DPA-GP on a wide array of regression tasks. |
| Researcher Affiliation | Academia | (1) Department of Computer Science, Cornell University, Ithaca, New York, USA; (2) Center for Data Science, New York University, New York City, New York, USA. |
| Pseudocode | No | The paper describes the proposed algorithms using mathematical formulations and textual descriptions but does not include a clearly labeled pseudocode or algorithm block. (An illustrative sketch of the projected additive kernel appears below the table.) |
| Open Source Code | Yes | We provide GPyTorch (Gardner et al., 2018) code at https://github.com/idelbrid/Randomly-Projected-Additive-GPs. |
| Open Datasets | Yes | To evaluate RPA and DPA-GP, we compute the normalized RMSE and negative log likelihood for a large number of UCI data sets. (...) Following Wilson et al. (2016) and Hinton and Salakhutdinov (2008), we construct regression data sets of three different sizes from the Olivetti faces data set. |
| Dataset Splits | No | The paper mentions 'cross-validation' for choosing J, but does not specify exact training, validation, or test dataset splits (e.g., percentages, sample counts, or specific predefined split citations) needed for reproduction. |
| Hardware Specification | Yes | We run this experiment on a 1.8 GHz Intel i5 processor and 8 GB of RAM. |
| Software Dependencies | No | The paper states 'We implement all models using GPyTorch (Gardner et al., 2018)' but does not provide specific version numbers for GPyTorch or any other software dependencies such as Python, PyTorch, or CUDA. |
| Experiment Setup | Yes | We train both RPA-GP with SKI and a GP with RBF kernel using Cholesky-based inference for 120 Adam iterations on synthetic data sets. We use RPA-GP with 20 1-dimensional projections and 512 inducing points per projection. (A hedged GPyTorch sketch of this setup follows the table.) |
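
As noted in the Pseudocode row, the paper presents its method mathematically rather than as an algorithm block. The core construction is simple to sketch: an RPA-GP kernel is a sum of one-dimensional base kernels, each applied to the inputs after projection onto a random direction. The NumPy sketch below is illustrative only; the function name `rpa_kernel`, the RBF base kernel, the shared hyperparameters, and the unit-norm Gaussian directions are assumptions made here, not the authors' implementation.

```python
import numpy as np

def rpa_kernel(X1, X2, thetas, lengthscale=1.0, outputscale=1.0):
    """Illustrative projected additive kernel:
        k(x, x') = sum_j k_RBF(theta_j^T x, theta_j^T x'),
    where each theta_j is a random projection direction."""
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for theta in thetas:                      # thetas: (J, d) projection directions
        z1, z2 = X1 @ theta, X2 @ theta       # project inputs down to 1-D
        sq_dist = (z1[:, None] - z2[None, :]) ** 2
        K += outputscale * np.exp(-0.5 * sq_dist / lengthscale ** 2)
    return K

# Example: J = 20 unit-norm Gaussian directions for d = 10 dimensional inputs.
rng = np.random.default_rng(0)
d, J = 10, 20
thetas = rng.standard_normal((J, d))
thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
X = rng.standard_normal((5, d))
K = rpa_kernel(X, X, thetas)                  # (5, 5) kernel matrix
```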
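The Experiment Setup row reports RPA-GP trained with SKI for 120 Adam iterations, using 20 one-dimensional projections and 512 inducing points per projection. Below is a minimal GPyTorch sketch of such a configuration, assuming an RBF base kernel, unit-norm Gaussian projections, synthetic training data, and a 0.1 learning rate; it is a reconstruction under those assumptions, not the authors' released code (see their repository above for that).

```python
import torch
import gpytorch

class RPAGPModel(gpytorch.models.ExactGP):
    """Sketch of an RPA-GP: inputs are projected onto J random directions,
    a SKI (GridInterpolationKernel) RBF kernel with 512 grid points is
    applied to each 1-D projection, and the J kernels are summed."""

    def __init__(self, train_x, train_y, likelihood, num_proj=20, grid_size=512):
        super().__init__(train_x, train_y, likelihood)
        d = train_x.size(-1)
        proj = torch.randn(d, num_proj)       # assumed Gaussian projections
        self.register_buffer("proj", proj / proj.norm(dim=0, keepdim=True))
        self.mean_module = gpytorch.means.ConstantMean()
        # One SKI-wrapped RBF kernel per 1-D projected coordinate, then summed.
        self.covar_module = gpytorch.kernels.AdditiveKernel(*[
            gpytorch.kernels.GridInterpolationKernel(
                gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel()),
                grid_size=grid_size, num_dims=1, active_dims=(j,),
            )
            for j in range(num_proj)
        ])

    def forward(self, x):
        z = x @ self.proj                     # (n, d) -> (n, J) projected inputs
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )

# 120 Adam iterations, as described in the experiment setup (synthetic data here).
train_x = torch.randn(100, 10)
train_y = torch.randn(100)
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = RPAGPModel(train_x, train_y, likelihood)
model.train()
likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(120):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    optimizer.step()
```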