Convergence Guarantees for Adaptive Bayesian Quadrature Methods

Authors: Motonobu Kanagawa, Philipp Hennig

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work, for a broad class of adaptive Bayesian quadrature methods, we prove consistency, deriving non-tight but informative convergence rates. To do so, we introduce a new concept we call weak adaptivity. Our results identify a large and flexible class of adaptive Bayesian quadrature rules as consistent, within which practitioners can develop empirically efficient methods.
Researcher Affiliation | Academia | Motonobu Kanagawa (EURECOM, Sophia Antipolis, France; University of Tübingen and Max Planck Institute for Intelligent Systems, Tübingen, Germany) and Philipp Hennig (University of Tübingen and Max Planck Institute for Intelligent Systems, Tübingen, Germany). Emails: motonobu.kanagawa@eurecom.fr, philipp.hennig@uni-tuebingen.de
Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. Figure 1 illustrates the relationships between auxiliary results and main results, not an algorithm.
Open Source Code | No | The paper does not provide any statements or links regarding the availability of open-source code for the described methodology.
Open Datasets | No | This paper is theoretical and does not use any publicly available datasets for empirical training or evaluation.
Dataset Splits | No | This paper is theoretical and does not describe any empirical experiments or dataset splits for training, validation, or testing.
Hardware Specification | No | This paper is theoretical and does not describe any experiments; therefore, no hardware specifications are provided.
Software Dependencies | No | This paper is theoretical and does not describe any experiments; therefore, no software dependencies with specific version numbers are listed.
Experiment Setup | No | This paper is theoretical and does not describe any experiments or their setup, including hyperparameters or training configurations.
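Since the paper itself contains no pseudocode, the following is a minimal illustrative sketch of what a generic adaptive Bayesian quadrature loop looks like, not the paper's method or proofs. All choices here are assumptions for illustration: a zero-mean GP with a Gaussian (RBF) kernel, a uniform integration measure on [0, 1], and new evaluation points chosen by maximizing the GP posterior variance over a candidate grid (the "adaptive" design step whose consistency the paper studies in generality).

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Gaussian kernel k(a, b) = exp(-(a - b)^2 / (2 ell^2))."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def adaptive_bq(f, n_iter=15, ell=0.3, jitter=1e-8):
    """Estimate the integral of f over [0, 1] via adaptive Bayesian quadrature.

    Illustrative sketch: point selection by posterior-variance maximization
    over a fixed candidate grid; lengthscale and jitter are arbitrary.
    """
    grid = np.linspace(0.0, 1.0, 401)        # candidate / quadrature grid
    X = np.array([0.5])                      # start from a single point
    for _ in range(n_iter):
        K = rbf(X, X, ell) + jitter * np.eye(len(X))
        Kinv = np.linalg.inv(K)
        k_star = rbf(grid, X, ell)           # cross-covariances grid vs. X
        # GP posterior variance at each candidate (prior variance is 1)
        var = 1.0 - np.einsum('ij,jk,ik->i', k_star, Kinv, k_star)
        X = np.append(X, grid[np.argmax(var)])  # adaptive design choice
    # kernel mean z_i ~= integral of k(x, X_i) dx, approximated on the grid
    z = rbf(grid, X, ell).mean(axis=0)
    K = rbf(X, X, ell) + jitter * np.eye(len(X))
    w = np.linalg.solve(K, z)                # BQ weights
    return float(w @ f(X))                   # weighted sum of evaluations

est = adaptive_bq(lambda x: x ** 2)          # true integral is 1/3
```

The final estimate is the posterior mean of the integral, a weighted sum of function evaluations with weights determined by the kernel mean and the Gram matrix; the adaptivity lies only in how the evaluation points were chosen, which is exactly the design freedom the paper's weak-adaptivity condition constrains.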