Efficient learning of smooth probability functions from Bernoulli tests with guarantees

Authors: Paul Rolland, Ali Kavis, Alexander Immer, Adish Singla, Volkan Cevher

ICML 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Numerical results show that the empirical convergence rates match the theory, and illustrate the superiority of our approach in handling contextual features over the state-of-the-art.
Researcher Affiliation | Academia | École Polytechnique Fédérale de Lausanne, Switzerland; Max Planck Institute for Software Systems, Saarbrücken, Germany.
Pseudocode | Yes | Algorithm 1: Smooth Beta Process (SBP); Algorithm 2: Inference engine for the simplified dynamic setting (constant A, B); Algorithm 3: Contextual Smooth Beta Process (CSBP).
Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available.
Open Datasets | No | The paper uses synthetic data generated for the experiments ("We construct a function π : X → [0, 1], uniformly select points {x_i}_{i=1,...,t}, and sample s_i ∼ Bernoulli(π(x_i))" and "We construct a synthetic dataset by uniformly sampling exercise difficulties x and fatigue levels α_f"). It does not use or provide access information for a pre-existing, publicly available dataset.
Dataset Splits | No | The paper conducts numerical experiments using synthetic data and evaluates L2 errors. However, it does not explicitly mention standard training, validation, and test dataset splits with specific percentages or counts. The evaluation is performed over all points x ∈ X or on generated samples.
Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU models, CPU types, memory) used to run the experiments.
Software Dependencies | No | The paper does not list specific software dependencies with version numbers (e.g., Python version, library versions) that would be needed to replicate the experiments.
Experiment Setup | Yes | We construct a function π : X → [0, 1], uniformly select points {x_i}_{i=1,...,t}, and sample s_i ∼ Bernoulli(π(x_i)), i = 1, ..., t. From these data, SBP constructs the posterior distributions π(x|S) for all x ∈ X. This experiment is performed both in a 1D setting using a feature space X = [0, 1], and in 2D with X = [0, 1]². Explicit forms of the chosen functions are presented in the Appendix. For the dynamic setting, contextual parameters {B_i}_{i=1,...,t} are sampled independently and uniformly from [0, 1], and the tests are then performed by sampling s_i ∼ Bernoulli((1 − B_i)π(x_i) + B_i), i = 1, ..., t. The posterior is constructed using CSBP. We also applied LGP to this dynamic setting by including the parameter B as an additional feature. In order to evaluate π, LGP returns the approximated distribution associated with B = 0. We run SBP with fixed kernel widths of 50^(−1/(d+2)) and 500000^(−1/(d+2)). (Code sketches of this data-generation setup and of the kernel-smoothing idea appear after this table.)
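
The quoted experiment setup can be made concrete with a minimal Python sketch of the synthetic data generation. The specific function pi below and the sample count t are illustrative placeholders (the paper's explicit function choices appear only in its appendix); only the sampling scheme itself is taken from the setup quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

def pi(x):
    # Illustrative smooth probability function pi : [0, 1] -> [0, 1];
    # the paper's explicit choices are given only in its appendix.
    return 0.5 + 0.4 * np.sin(2 * np.pi * x)

t = 1000                               # placeholder number of Bernoulli tests
x = rng.uniform(0.0, 1.0, size=t)      # uniformly selected points x_i in X = [0, 1]
s = rng.binomial(1, pi(x))             # static setting: s_i ~ Bernoulli(pi(x_i))

# Dynamic setting: contextual parameters B_i ~ Uniform[0, 1] and
# s_i ~ Bernoulli((1 - B_i) * pi(x_i) + B_i), as quoted in the setup row above.
B = rng.uniform(0.0, 1.0, size=t)
s_dyn = rng.binomial(1, (1.0 - B) * pi(x) + B)
```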
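As a companion sketch, the following shows one plausible reading of the kernel-smoothing idea suggested by the paper's algorithm titles (a Smooth-Beta-Process-style posterior with constants A, B): kernel weights from nearby Bernoulli observations are accumulated into Beta pseudo-counts. The Gaussian kernel, the prior counts a0 and b0, and the function smooth_beta_posterior are all assumptions made here for illustration; this is not a reproduction of the paper's Algorithms 1-3. It reuses x, s, and pi from the previous sketch.

```python
import numpy as np

def smooth_beta_posterior(x_query, x_obs, s_obs, width, a0=1.0, b0=1.0):
    """Kernel-smoothed Beta posterior over pi(x) at the query points.

    Each observation (x_i, s_i) contributes a kernel weight to the Beta
    pseudo-counts at x: successes add to alpha, failures add to beta.
    The Gaussian kernel and prior counts a0, b0 are illustrative choices.
    """
    d2 = (x_query[:, None] - x_obs[None, :]) ** 2   # pairwise squared distances (1D features)
    w = np.exp(-d2 / (2.0 * width ** 2))            # kernel weights K(x, x_i)
    alpha = a0 + w @ s_obs                          # kernel-weighted success counts
    beta = b0 + w @ (1.0 - s_obs)                   # kernel-weighted failure counts
    return alpha, beta, alpha / (alpha + beta)      # posterior mean as estimate of pi(x)

# Example use with the synthetic data from the previous sketch, together with
# an L2-type error of the kind mentioned in the Dataset Splits row.
x_grid = np.linspace(0.0, 1.0, 200)
_, _, pi_hat = smooth_beta_posterior(x_grid, x, s, width=0.05)
l2_error = np.sqrt(np.mean((pi_hat - pi(x_grid)) ** 2))
```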