Fast Kronecker Inference in Gaussian Processes with non-Gaussian Likelihoods

Authors: Seth Flaxman, Andrew Wilson, Daniel Neill, Hannes Nickisch, Alex Smola

ICML 2015

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We evaluate our methods on synthetic and real data, focusing on runtime and accuracy for inference and hyperparameter learning." |
| Researcher Affiliation | Collaboration | ¹Carnegie Mellon University, ²Philips Research Hamburg, ³Marianas Labs |
| Pseudocode | Yes | "Pseudocode for our algorithm is shown in Algorithm 1." |
| Open Source Code | Yes | "We have implemented code as part of the GPML toolbox (Rasmussen & Nickisch, 2010). See http://www.cs.cmu.edu/~andrewgw/pattern for updates and demos." |
| Open Datasets | Yes | "The City of Chicago makes geocoded, date-stamped crime report data publicly available through its data portal (http://data.cityofchicago.org)." |
| Dataset Splits | Yes | "We used 5-fold cross-validation, relearning the hyperparameters for each fold and making predictions for the latent function values f_i on the 20% of data that was held out." |
| Hardware Specification | No | The paper does not specify the hardware used for its experiments (e.g., CPU/GPU models, memory, or cloud instance types). |
| Software Dependencies | No | The paper states that its code is implemented as part of the GPML toolbox (Rasmussen & Nickisch, 2010), but gives no version numbers for the toolbox or for any other software dependencies. |
| Experiment Setup | Yes | "We ran non-linear conjugate gradient descent for 200 iterations. For hyperparameter learning, our spatial grid was 17 × 26, corresponding to 1 mile by 1 mile grid cells, and our temporal grid was one cell per week, for a total of 416 weeks." |
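The experiment-setup row describes data on a space-time grid (a 17 × 26 spatial grid crossed with 416 weekly cells), which is what makes Kronecker methods applicable: a product kernel on a Cartesian grid factors as K = K_t ⊗ K_s, so matrix-vector products never need the full n × n matrix. The following is a minimal illustrative sketch of that identity in NumPy, not the paper's GPML implementation; the function name `kron_mvp` and the random test matrices are our own.

```python
import numpy as np

def kron_mvp(K_t, K_s, v):
    """Compute (K_t kron K_s) @ v without forming the Kronecker product.

    For index i = t * g_s + s (row-major grid ordering), the identity
    (K_t kron K_s) v == vec(K_t @ V @ K_s.T) with V = v reshaped to
    (g_t, g_s) reduces the cost from O((g_t*g_s)^2) to O(g_t*g_s*(g_t+g_s)).
    """
    g_t, g_s = K_t.shape[0], K_s.shape[0]
    V = v.reshape(g_t, g_s)           # unstack v onto the (time, space) grid
    return (K_t @ V @ K_s.T).ravel()  # re-stack the result into a vector

# Small sanity check against the explicit Kronecker product.
rng = np.random.default_rng(0)
K_t = rng.standard_normal((5, 5))   # stand-in temporal kernel matrix
K_s = rng.standard_normal((7, 7))   # stand-in spatial kernel matrix
v = rng.standard_normal(5 * 7)
fast = kron_mvp(K_t, K_s, v)
slow = np.kron(K_t, K_s) @ v
print(np.allclose(fast, slow))
```

At the paper's scale (17 × 26 × 416 ≈ 184k grid cells), this factorization is what keeps iterative solvers such as conjugate gradients tractable, since each iteration only needs fast mat-vecs.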