Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)
Authors: Andrew Wilson, Hannes Nickisch
ICML 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate SKI for kernel matrix approximation (section 4.1), kernel learning (section 4.2), and natural sound modelling (section 4.3). |
| Researcher Affiliation | Collaboration | Andrew Gordon Wilson (andrewgw@cs.cmu.edu), Carnegie Mellon University; Hannes Nickisch (hannes@nickisch.org), Philips Research Hamburg |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | We have implemented code as an extension to the GPML toolbox (Rasmussen & Nickisch, 2010). For updates and demos, see http://www.cs.cmu.edu/~andrewgw/pattern |
| Open Datasets | Yes | We use SKI to model the natural sound time series in Fig 3(a), considered in a different context by Turner (2010). |
| Dataset Splits | No | The paper mentions the datasets used but does not describe explicit train/validation/test splits. |
| Hardware Specification | Yes | All experiments were performed on a 2011 MacBook Pro, with an Intel i5 2.3 GHz processor and 4 GB of RAM. |
| Software Dependencies | No | The paper mentions the GPML toolbox (Rasmussen & Nickisch, 2010) but does not specify version numbers for its software dependencies. |
| Experiment Setup | Yes | For SKI, we use cubic interpolation and a 100 × 100 inducing point grid, equispaced in each input dimension. That is, we have as many inducing points m = 10,000 as we have training datapoints. We use the same θ initialisation for each approach. |
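The setup above rests on the core SKI idea: approximate the training kernel matrix as K ≈ W K_UU Wᵀ, where K_UU is the kernel evaluated on an equispaced inducing-point grid and W is a sparse interpolation matrix. A minimal 1D sketch follows; it uses linear interpolation (two nonzero weights per row) rather than the paper's cubic interpolation for brevity, and the grid size, lengthscale, and helper names are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 l^2)).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def interp_weights(x, grid):
    # Interpolation matrix W (n x m): each row holds the two linear
    # interpolation weights mapping x_i onto its neighbouring grid
    # points. (KISS-GP uses cubic interpolation, i.e. four nonzeros
    # per row, and stores W as a sparse matrix.)
    n, m = len(x), len(grid)
    h = grid[1] - grid[0]                      # equispaced grid step
    W = np.zeros((n, m))
    idx = np.clip(np.floor((x - grid[0]) / h).astype(int), 0, m - 2)
    frac = (x - grid[idx]) / h
    W[np.arange(n), idx] = 1.0 - frac
    W[np.arange(n), idx + 1] = frac
    return W

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 50))        # training inputs
grid = np.linspace(0.0, 10.0, 200)             # equispaced inducing points

W = interp_weights(x, grid)
K_exact = rbf_kernel(x, x)
K_ski = W @ rbf_kernel(grid, grid) @ W.T       # K ≈ W K_UU W^T

print("max abs error:", np.max(np.abs(K_ski - K_exact)))
```

Because the grid is equispaced, K_UU has Toeplitz structure in 1D (Kronecker structure on multi-dimensional grids), which is what lets the full method use fast matrix-vector multiplies; this dense sketch only illustrates the interpolation-based approximation itself.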