Bayesian Optimisation with Unknown Hyperparameters: Regret Bounds Logarithmically Closer to Optimal

Authors: Juliusz Ziomek, Masaki Adachi, Michael A Osborne

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We also empirically evaluate our algorithm on synthetic and real-world benchmarks and show it outperforms A-GP-UCB, maximum likelihood estimation and MCMC. We now evaluate the performance of our algorithm on multiple synthetic and real-world functions."
Researcher Affiliation | Collaboration | "Juliusz Ziomek, Masaki Adachi, Michael A. Osborne; Machine Learning Research Group, University of Oxford; Toyota Motor Corporation; Corresponding Author: {juliusz, masaki, mosb}@robots.ox.ac.uk"
Pseudocode | Yes | "In Algorithm 1, we present LB-GP-UCB, an algorithm employing this mechanism. We present the pseudo-code of that procedure in Algorithm 2. In Algorithm 3 we present the Length scale and Bound Balancing (LNB) algorithm."
Open Source Code | Yes | "We open-source our code" (footnote 2: https://github.com/JuliuszZiomek/LB-GP-UCB)
Open Datasets | Yes | "We start with a one-dimensional toy problem proposed by the same paper that proposed the A-GP-UCB algorithm [6]. As a next benchmark, we evaluate our algorithm on the five-dimensional Michalewicz synthetic function. We utilise material design tasks proposed by [17] and [30]: the 4-dimensional Crossed Barrel and 5-dimensional AGNP tasks." (The Michalewicz benchmark is sketched in code after the table.)
Dataset Splits | No | The paper does not specify explicit training, validation, and test splits with percentages or sample counts for the datasets used in its experiments.
Hardware Specification | Yes | "To run all experiments we used a machine with AMD Ryzen Threadripper 3990X 64-Core Processor and 252 GB of RAM. No GPU was needed to run the experiments. We were running multiple runs in parallel. To complete one run of each method we allocated four CPU cores."
Software Dependencies | No | The paper mentions using the 'BoTorch package [5, 34]' and 'Pyro [7]' but does not provide specific version numbers for these software libraries.
Experiment Setup | Yes | "We used the UCB acquisition function and we compared different techniques for selecting the length scale value. For all experiments, we used isotropic ν-Matérn kernel with ν = 2.5. To achieve a fair comparison, we used the same growth function g(t) = max{t0, t} for both LB-GP-UCB and A-GP-UCB across all experiments, where t0 was selected so that at least 5 candidates are generated for g(1). We used 10 initial points for each algorithm unless specified otherwise." (A minimal configuration sketch of this setup appears after the table.)
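
The experiment-setup row reports an isotropic Matérn kernel with ν = 2.5, a UCB acquisition function, 10 initial points, and a growth function g(t) = max{t0, t}. The sketch below shows one iteration of such a loop in BoTorch; it is not the paper's LB-GP-UCB procedure. The length scale value (0.2), the UCB parameter beta = 2.0, the threshold t0 = 5, the problem dimension, and the stand-in objective values are all illustrative assumptions; in the paper the length scale would instead be chosen by the method under comparison (LNB, MLE, or MCMC).

```python
import torch
from botorch.models import SingleTaskGP
from botorch.acquisition import UpperConfidenceBound
from botorch.optim import optimize_acqf
from gpytorch.kernels import MaternKernel, ScaleKernel

# Ten initial points in a 5-dimensional unit cube, as in the quoted setup.
dim = 5
bounds = torch.stack(
    [torch.zeros(dim, dtype=torch.double), torch.ones(dim, dtype=torch.double)]
)
train_X = torch.rand(10, dim, dtype=torch.double)
train_Y = -torch.rand(10, 1, dtype=torch.double)  # stand-in objective values

# Isotropic Matérn kernel with nu = 2.5 (no ard_num_dims, so one shared
# length scale). The value 0.2 is purely illustrative; the paper's methods
# set this hyperparameter via their respective selection rules.
covar_module = ScaleKernel(MaternKernel(nu=2.5))
model = SingleTaskGP(train_X, train_Y, covar_module=covar_module)
model.covar_module.base_kernel.lengthscale = 0.2

# Growth function g(t) = max{t0, t} from the quoted setup; t0 = 5 is assumed.
t0 = 5
def g(t: int) -> int:
    return max(t0, t)

# One UCB step; beta = 2.0 is an assumed exploration parameter.
ucb = UpperConfidenceBound(model, beta=2.0)
candidate, _ = optimize_acqf(ucb, bounds=bounds, q=1, num_restarts=5, raw_samples=64)
print(candidate)
```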
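
The five-dimensional Michalewicz function quoted in the Open Datasets row is a standard synthetic benchmark. Below is a minimal sketch of its usual definition, f(x) = -sum_i sin(x_i) sin(i x_i^2 / π)^(2m) on [0, π]^d; the steepness m = 10 is the common default and is an assumption here, as the paper's exact parameterisation is not quoted.

```python
import numpy as np

def michalewicz(x: np.ndarray, m: int = 10) -> float:
    """Michalewicz synthetic test function (minimisation), domain [0, pi]^d.

    Standard form: f(x) = -sum_i sin(x_i) * sin(i * x_i^2 / pi)^(2m).
    m = 10 is the common default and an assumption here.
    """
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(-np.sum(np.sin(x) * np.sin(i * x**2 / np.pi) ** (2 * m)))

# Example: evaluate a random point in the five-dimensional domain [0, pi]^5.
rng = np.random.default_rng(0)
print(michalewicz(rng.uniform(0.0, np.pi, size=5)))
```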