Scalable Gaussian Process Separation for Kernels with a Non-Stationary Phase
Authors: Jan Graßhoff, Alexandra Jankowski, Philipp Rostalski
ICML 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our approach is demonstrated on numerical examples and large spatio-temporal biomedical problems. We demonstrate scalability of our proposed method to n > 10^5 points on a numerical example and on openly available biomedical datasets. |
| Researcher Affiliation | Academia | Jan Graßhoff, Alexandra Jankowski, Philipp Rostalski (Institute for Electrical Engineering in Medicine, Universität zu Lübeck, Germany). Correspondence to: Jan Graßhoff <j.grasshoff@uni-luebeck.de>. |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. The methods are described in text and mathematical formulations. |
| Open Source Code | Yes | Code implementations of the proposed warp SKI in MATLAB are provided as an extension to the GPML 4.2 toolbox (Rasmussen & Nickisch, 2010) and are available under github.com/ime-luebeck/non-stationary-phase-gp-mod. |
| Open Datasets | Yes | We validate the proposed warp SKI on data taken from the Physionet fetal ECG database (Jezewski et al., 2012)... The considered dataset of a spontaneously breathing neonate is taken from Heinrich et al. (2006)... The data for this problem were taken from (Aras et al., 2015)... |
| Dataset Splits | No | The paper describes using subsets of the data for training and evaluation and specific testing protocols (e.g., predicting the next frame, removing electrodes). However, it does not give explicit train/test/validation splits (exact percentages or counts per partition), nor does it define a separate validation set for hyperparameter tuning distinct from training and testing. |
| Hardware Specification | Yes | all experiments were carried out on a workstation with an INTEL Core i7-6700K CPU. |
| Software Dependencies | Yes | Code implementations of the proposed warp SKI in MATLAB are provided as an extension to the GPML 4.2 toolbox (Rasmussen & Nickisch, 2010)... |
| Experiment Setup | Yes | In all experiments, L-BFGS (Liu & Nocedal, 1989) was used for hyperparameter learning, respectively with a maximum of 100 optimization steps. The tolerance for conjugate gradients was set to 10^-1, and marginal likelihood evaluations were done using 20 probe vectors in the stochastic trace estimation. |
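
As a rough illustration of the experiment setup quoted in the last row, the following Python sketch shows how a stochastic trace estimator with 20 probe vectors and loose-tolerance conjugate-gradient solves can approximate the trace terms that appear in the GP log marginal likelihood and its gradient. This is not the authors' warp-SKI/GPML MATLAB code: the dense kernel matrix, data size, and SciPy calls are illustrative assumptions, whereas the paper works with structured (warp-SKI) matrix-vector products inside GPML 4.2.

```python
# Illustrative sketch only (assumes SciPy >= 1.12; older versions use tol=
# instead of rtol=). Not the authors' warp-SKI/GPML implementation.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, num_probes, cg_tol = 1000, 20, 1e-1   # 20 probe vectors, CG tolerance 1e-1

# Stand-in covariance matrix: squared-exponential kernel plus noise. In the
# paper, matrix-vector products with K come from the structured (warp-)SKI
# approximation rather than a dense matrix.
x = np.linspace(0.0, 10.0, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 0.1 * np.eye(n)
K_op = LinearOperator((n, n), matvec=lambda v: K @ v)

# Hutchinson estimator of tr(K^{-1} B), the kind of term appearing in the
# gradient of the log marginal likelihood (B would be dK/dtheta). Each probe
# vector costs one CG solve with K.
B = K  # placeholder "derivative": tr(K^{-1} K) = n, which gives a sanity check
trace_est = 0.0
for _ in range(num_probes):
    z = rng.choice([-1.0, 1.0], size=n)     # Rademacher probe vector
    w, info = cg(K_op, z, rtol=cg_tol)      # w ~= K^{-1} z
    trace_est += z @ (B @ w)
trace_est /= num_probes

print(f"estimated tr(K^-1 B) = {trace_est:.1f}  (exact value: {n})")
```

In the paper's setup, such stochastic estimates enter the marginal-likelihood evaluations that L-BFGS uses for hyperparameter learning, with optimization capped at 100 steps.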