Quantile Propagation for Wasserstein-Approximate Gaussian Processes

Authors: Rui Zhang, Christian Walder, Edwin V. Bonilla, Marian-Andrei Rizoiu, Lexing Xie

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on classification and Poisson regression show that QP outperforms both EP and variational Bayes.
Researcher Affiliation | Academia | The Australian National University; CSIRO's Data61, Australia; University of Technology Sydney; The University of Sydney
Pseudocode | Yes | We summarize EP in algorithm 1 (Appendix). ... As summarized in Algorithm 1 (Appendix)
Open Source Code | Yes | Our code is publicly available at https://github.com/RuiZhang2016/Quantile-Propagation-for-Wasserstein-Approximate-Gaussian-Processes
Open Datasets | Yes | We perform binary classification experiments on the five real world datasets employed by Kuss and Rasmussen [28]: Ionosphere (IonoS), Wisconsin Breast Cancer, Sonar [13], Leptograpsus Crabs and Pima Indians Diabetes [48]. We use two additional UCI datasets as further evidence: Glass and Wine [13]. ... [13] Dua, D. and Graff, C. (2017). UCI Machine Learning Repository.
Dataset Splits | Yes | In the experiments, we randomly split each dataset into 10 folds, each time using 1 fold for testing and the other 9 folds for training, with features standardized to zero mean and unit standard deviation. (A sketch of this split protocol follows the table.)
Hardware Specification | No | The paper states: 'This research was undertaken with the assistance of resources from the National Computational Infrastructure (NCI Australia), an NCRIS-enabled capability supported by the Australian Government.' However, it does not specify concrete hardware details such as GPU/CPU models or memory.
Software Dependencies | No | The paper mentions: 'The implementations of EP and VB in Python are publicly available [18]', and refers to 'GPy: A Gaussian process framework in python'. While GPy is named, no specific version number for GPy or Python is provided.
Experiment Setup | Yes | For both EP and QP, we stop local updates, i.e., the inner loop in Algorithm 1 (Appendix), when the root mean squared change in parameters is less than 10⁻⁶. In the outer loop, the GP hyper-parameters are optimized by L-BFGS-B [6] with a maximum of 10³ iterations and a relative tolerance of 10⁻⁹ for the function value. VB is also optimized by L-BFGS-B with the same configuration. Parameters shared by the three methods are initialized to be the same. (A hedged sketch of this optimization setup follows the table.)
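
The split protocol quoted in the Dataset Splits row (10 random folds, 1 for testing, 9 for training, features standardized to zero mean and unit standard deviation) maps directly onto standard tooling. Below is a minimal sketch using NumPy and scikit-learn; this is an illustration only, not the paper's GPy-based code, and the random matrix is a hypothetical stand-in for any of the UCI datasets. Fitting the scaler on the training folds alone is also an assumption, since the quoted text does not say where the standardization statistics are computed.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for one of the UCI datasets (shapes echo Ionosphere).
rng = np.random.default_rng(0)
X = rng.normal(size=(351, 34))
y = 2 * rng.integers(0, 2, size=351) - 1  # binary labels in {-1, +1}

# 10 random folds: each fold serves once as the test set,
# the remaining 9 folds as the training set.
kf = KFold(n_splits=10, shuffle=True, random_state=0)
for train_idx, test_idx in kf.split(X):
    scaler = StandardScaler().fit(X[train_idx])  # zero mean, unit std
    X_train = scaler.transform(X[train_idx])
    X_test = scaler.transform(X[test_idx])
    # ... fit EP / QP / VB on (X_train, y[train_idx]), evaluate on X_test ...
```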
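The Experiment Setup row describes two nested stopping rules: the EP/QP inner loop halts when the root mean squared change in the site parameters drops below 10⁻⁶, and the outer hyper-parameter loop runs L-BFGS-B for at most 10³ iterations with a relative function-value tolerance of 10⁻⁹. A hedged sketch of how those numbers could be wired up with SciPy follows; the objective is a toy placeholder, since the real one (the approximate negative log marginal likelihood produced by the inner loop) lives in the authors' GPy-based code.

```python
import numpy as np
from scipy.optimize import minimize

def inner_loop_converged(old_params, new_params, tol=1e-6):
    """Inner-loop stopping rule: root mean squared change in the
    site parameters is below tol (10^-6 in the quoted setup)."""
    return np.sqrt(np.mean((new_params - old_params) ** 2)) < tol

def neg_log_marginal_likelihood(theta):
    # Toy placeholder: the actual objective would run the EP/QP inner
    # loop to convergence and return the approximate negative log
    # marginal likelihood at hyper-parameters theta.
    return float(np.sum((theta - 1.0) ** 2))

# Outer loop: L-BFGS-B, max 10^3 iterations, relative function-value
# tolerance 10^-9 (SciPy's `ftol` for L-BFGS-B is exactly such a
# relative tolerance on the objective value).
theta0 = np.zeros(2)  # e.g. log length-scale and log signal variance
result = minimize(neg_log_marginal_likelihood, theta0,
                  method="L-BFGS-B",
                  options={"maxiter": 1000, "ftol": 1e-9})
print(result.x, result.fun)
```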