Frank-Wolfe Bayesian Quadrature: Probabilistic Integration with Theoretical Guarantees

Authors: François-Xavier Briol, Chris Oates, Mark Girolami, Michael A. Osborne

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In simulations, FWBQ is competitive with state-of-the-art methods and out-performs alternatives based on Frank-Wolfe optimisation. Our approach is applied to successfully quantify numerical error in the solution to a challenging Bayesian model choice problem in cellular biology."
Researcher Affiliation | Academia | Department of Statistics, University of Warwick; School of Mathematical and Physical Sciences, University of Technology Sydney; Department of Engineering Science, University of Oxford
Pseudocode | Yes | Algorithm 1: The Frank-Wolfe (FW) and Frank-Wolfe with Line-Search (FWLS) Algorithms.
Require: function J, initial state g1 = ḡ1 ∈ G (and, for FW only: step-size sequence {ρi}ⁿᵢ₌₁).
1: for i = 2, …, n do
2:   Compute ḡi = argmin_{g ∈ G} ⟨g, (DJ)(g_{i−1})⟩
3:   [For FWLS only, line search: ρi = argmin_{ρ ∈ [0,1]} J((1 − ρ)g_{i−1} + ρ ḡi)]
4:   Update gi = (1 − ρi)g_{i−1} + ρi ḡi
5: end for
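The FW/FWLS loop in Algorithm 1 can be sketched in Python. This is a minimal illustration on a toy problem of our own choosing (minimising J(g) = ||g − t||² over the probability simplex, whose extreme points make the linear minimisation oracle trivial), not the paper's FWBQ objective over a marginal polytope; the target t, the grid-based line search, and the default step size ρi = 1/i for plain FW are illustrative assumptions.

```python
import numpy as np

def frank_wolfe(J, grad_J, vertices, n_iter=50, line_search=True):
    """Generic FW/FWLS loop over the convex hull of `vertices` (Algorithm 1 shape)."""
    g = vertices[0].copy()                      # g_1 = g_bar_1: an initial extreme point of G
    for i in range(2, n_iter + 1):
        d = grad_J(g)                           # (DJ)(g_{i-1})
        # Linear minimisation oracle: g_bar_i = argmin_{g' in G} <g', (DJ)(g_{i-1})>
        g_bar = vertices[np.argmin(vertices @ d)]
        if line_search:                         # FWLS: line search over rho in [0, 1] (grid approximation)
            rhos = np.linspace(0.0, 1.0, 101)
            rho = rhos[np.argmin([J((1 - r) * g + r * g_bar) for r in rhos])]
        else:                                   # FW: a pre-set step-size sequence (rho_i = 1/i assumed here)
            rho = 1.0 / i
        g = (1 - rho) * g + rho * g_bar         # convex-combination update g_i
    return g

# Toy instance: minimiser t already lies inside the simplex, so FW can reach it.
t = np.array([0.2, 0.5, 0.3])
verts = np.eye(3)                               # extreme points of the 3-simplex
J = lambda g: np.sum((g - t) ** 2)
grad_J = lambda g: 2 * (g - t)
g_star = frank_wolfe(J, grad_J, verts)
```

Because every iterate is a convex combination of simplex vertices, `g_star` stays a valid probability vector throughout, which mirrors how FWBQ keeps its quadrature weights in the relevant convex set.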
Open Source Code | No | The paper does not provide any specific links or explicit statements about the release of their implementation code.
Open Datasets | No | The paper mentions using "a 20-component mixture of 2D-Gaussian distributions" for the simulation study and applies FWBQ to "one of the model selection tasks in [19]". While [19] is cited, the paper itself does not provide concrete access information (link, DOI, repository) for the specific data or simulation setup used in this paper, nor does it refer to a widely recognized public dataset with clear access.
Dataset Splits | No | The paper describes simulation setups and application details (e.g., "n = 10 design points"), but it does not specify any train/validation/test splits for any dataset.
Hardware Specification | No | The paper does not specify any hardware details (e.g., CPU, GPU models, memory, specific cloud instances) used for running the experiments.
Software Dependencies | No | The paper mentions using an "exponentiated-quadratic (EQ) kernel" and "Bayesian Quadrature", but does not list specific software libraries or their version numbers used for implementation (e.g., Python, PyTorch, TensorFlow, SciPy, etc. with versions).
Experiment Setup | Yes | "Here, the same kernel hyper-parameters (λ, σ) = (1, 0.8) were employed for all methods to have a fair comparison."
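For reference, the exponentiated-quadratic kernel with the reported hyper-parameters (λ, σ) = (1, 0.8) can be sketched as below. The parameterisation k(x, y) = λ² exp(−||x − y||² / (2σ²)) is one common convention; the paper's exact form (e.g. whether λ is squared or the length-scale enters differently) is an assumption here.

```python
import numpy as np

def eq_kernel(X, Y, lam=1.0, sigma=0.8):
    """Gram matrix of an EQ kernel between rows of X and Y.

    Assumes k(x, y) = lam^2 * exp(-||x - y||^2 / (2 * sigma^2)),
    with (lam, sigma) = (1, 0.8) as reported in the paper.
    """
    sq_dists = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return lam ** 2 * np.exp(-sq_dists / (2.0 * sigma ** 2))

# Two 2-D design points, matching the paper's 2-D simulation setting.
X = np.array([[0.0, 0.0], [1.0, 0.0]])
K = eq_kernel(X, X)   # symmetric 2x2 Gram matrix with unit diagonal
```

Under this convention the diagonal of the Gram matrix equals λ² = 1, and off-diagonal entries decay with squared distance at rate set by σ.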