Infinite-Horizon Gaussian Processes

Authors: Arno Solin, James Hensman, Richard E. Turner

NeurIPS 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We provide extensive evaluation of the IHGP both in terms of simulated benchmarks and four real-world experiments in batch and online modes.
Researcher Affiliation | Collaboration | Arno Solin (Aalto University, arno.solin@aalto.fi); James Hensman (PROWLER.io, james@prowler.io); Richard E. Turner (University of Cambridge, ret26@cam.ac.uk)
Pseudocode | Yes | Algorithm 1: Infinite-horizon Gaussian process (IHGP) inference. The GP prior is specified in terms of a state space model. After the setup cost on line 2, all operations are at most O(m²). (A steady-state filtering sketch is given below the table.)
Open Source Code | Yes | Code implementations in MATLAB/C++/Objective-C and video examples of real-time operation are available at https://github.com/AaltoML/IHGP.
Open Datasets | Yes | Coal mining disasters dataset: the data (available, e.g., in [35]) contain the dates of 191 coal mine explosions that killed ten or more people in Britain between the years 1851 and 1962, which we discretize into n = 200 bins. Airline accident dataset: as a more challenging regression problem we explain the time-dependent intensity of accidents and incidents of commercial aircraft; the data [22] consist of dates of 1210 incidents over the years 1919–2017. Electricity consumption: we do explorative analysis of electricity consumption for one household [9] recorded every minute (in log kW) over 1,442 days (n = 2,075,259, with 25,979 missing observations). (A binning sketch is given below the table.)
Dataset Splits | No | The paper reports total dataset sizes for the various experiments (e.g., n = 1,000; n = 200; n = 2,075,259), but it does not provide explicit percentages or sample counts for training/validation/test splits, so the data partitioning cannot be reproduced from the text.
Hardware Specification | Yes | Experiments run in MathWorks MATLAB (R2017b) on an Apple MacBook Pro (2.3 GHz Intel Core i5, 16 GB RAM). In the final experiment we implement the IHGP in C++ with wrappers in Objective-C for running as an app on an Apple iPhone 6s (iOS 11.3).
Software Dependencies | Yes | Experiments run in MathWorks MATLAB (R2017b) on an Apple MacBook Pro (2.3 GHz Intel Core i5, 16 GB RAM). In the final experiment we implement the IHGP in C++ with wrappers in Objective-C for running as an app on an Apple iPhone 6s (iOS 11.3).
Experiment Setup | Yes | The data were simulated from y_i = sinc(x_i − 6) + ε_i, ε_i ∼ N(0, 0.1) (Section 4.1). For non-Gaussian inference we set up an EP [5, 12, 19] scheme which only requires one forward pass (assumed density filtering, see also unscented filtering [27]) (Section 3.2). We formulate the hyperparameter optimisation problem as an incremental gradient descent, where η is a learning-rate (step-size) parameter (Section 3.3). Specific values are given for the electricity consumption and iPhone experiments: n_mb = 14,400, η = 0.001, window step size of 1 hr (Section 4.3). We fix the measurement noise to σ²_n = 1 and use separate learning rates η = (0.1, 0.01) in online estimation of the magnitude-scale and length-scale hyperparameters. The GP is re-estimated every 0.1 s (Section 4.4). (Simulation and optimisation sketches are given below the table.)
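To make the Pseudocode row concrete, the following is a minimal sketch (not the authors' MATLAB/C++ code) of the core idea behind infinite-horizon filtering: pay a one-off setup cost to solve a discrete algebraic Riccati equation for the stationary predictive covariance and Kalman gain, then filter with only O(m²) work per step. The Matérn-3/2 state-space construction and all numeric values below are illustrative assumptions, not settings taken from the paper's experiments.

```python
# Hedged sketch of steady-state (infinite-horizon style) Kalman filtering for a GP
# given as a state-space model. Not the authors' implementation; the Matern-3/2
# construction and the numbers are illustrative assumptions.
import numpy as np
from scipy.linalg import expm, solve_discrete_are

# Matern-3/2 prior as an SDE: dx = F x dt + L dB, observation y = H x + noise
ell, sigma2, dt, sigma2_n = 1.0, 1.0, 0.01, 0.1   # length-scale, magnitude, step, noise var
lam = np.sqrt(3.0) / ell
F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
L = np.array([[0.0], [1.0]])
Qc = 4.0 * lam**3 * sigma2                        # spectral density of the driving noise
H = np.array([[1.0, 0.0]])
R = np.array([[sigma2_n]])
m = F.shape[0]

# Discretise: A = expm(F dt); Q via the matrix-fraction (Van Loan style) trick
A = expm(F * dt)
Phi = expm(np.block([[F, L @ L.T * Qc], [np.zeros((m, m)), -F.T]]) * dt)
Q = Phi[:m, m:] @ A.T

# One-off setup: stationary predictive covariance from the filtering DARE,
# then the corresponding stationary Kalman gain.
P_pred = solve_discrete_are(A.T, H.T, Q, R)
S = H @ P_pred @ H.T + R
K = P_pred @ H.T @ np.linalg.inv(S)

def steady_state_filter(y):
    """Filter means with the fixed gain K: only O(m^2) matrix-vector work per step."""
    mk = np.zeros(m)
    means = []
    for yk in y:
        mk = A @ mk                                  # predict
        mk = mk + K[:, 0] * float(yk - (H @ mk)[0])  # update with the stationary gain
        means.append(float((H @ mk)[0]))
    return np.array(means)
```

The actual IHGP goes further (e.g., handling non-Gaussian likelihoods via the assumed density filtering scheme quoted in the Experiment Setup row), but the fixed-gain recursion above is the part that yields the O(m²) per-step cost.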
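The Open Datasets row describes discretising event dates into count bins before inference. A minimal sketch of that preprocessing step, assuming the dates are available as a NumPy array of fractional years, could look as follows; the file name and loading step are hypothetical placeholders.

```python
# Hedged sketch: turn event dates into binned counts, as described for the coal-mining
# data (191 explosions, 1851-1962, n = 200 bins). `coal_dates.txt` is hypothetical.
import numpy as np

event_years = np.loadtxt("coal_dates.txt")          # hypothetical file of fractional years
n_bins = 200
counts, edges = np.histogram(event_years, bins=n_bins,
                             range=(event_years.min(), event_years.max()))
bin_centres = 0.5 * (edges[:-1] + edges[1:])        # inputs x for the intensity model
# `counts` then serve as the (count-type) observations y over the bin centres.
```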
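For the Experiment Setup row, a short sketch of the quoted toy-data simulation and the incremental (sliding-window) gradient-descent loop might look like the following. The input range, the sinc convention, the reading of 0.1 as a noise variance, the parameterisation of θ, and the gradient routine are all assumptions or placeholders, not specifics from the paper.

```python
# Hedged sketch of the quoted experiment setup. Assumptions: inputs on [0, 12],
# unnormalised sinc, 0.1 read as a noise variance, log-parameterised hyperparameters,
# and a placeholder marginal-likelihood gradient.
import numpy as np

rng = np.random.default_rng(0)

# --- Toy regression data (Sec. 4.1): y_i = sinc(x_i - 6) + eps_i, eps_i ~ N(0, 0.1) ---
n = 1000
x = np.linspace(0.0, 12.0, n)                      # assumed input range
sinc = lambda t: np.sinc(t / np.pi)                # unnormalised sinc, sin(t)/t (assumed)
y = sinc(x - 6.0) + rng.normal(0.0, np.sqrt(0.1), size=n)

# --- Incremental gradient descent on hyperparameters (Sec. 3.3 / 4.3) ---
# theta <- theta - eta * d/dtheta [-log p(y_window | theta)], using the values quoted
# for the electricity/iPhone experiments: eta = 0.001, n_mb = 14,400, 1-hour window step.
eta, n_mb, step = 0.001, 14_400, 60                # step = 60 one-minute samples
theta = np.log(np.array([1.0, 1.0]))               # e.g. log magnitude and log length-scale

def neg_log_marglik_grad(theta, y_window):
    """Placeholder: gradient of the negative log marginal likelihood on one window
    (e.g. evaluated with the steady-state filter sketched above)."""
    raise NotImplementedError

def incremental_fit(theta, y_stream):
    """Slide a window over a long data stream and take one gradient step per window."""
    for start in range(0, len(y_stream) - n_mb + 1, step):
        theta = theta - eta * neg_log_marglik_grad(theta, y_stream[start:start + n_mb])
    return theta
```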