Bayesian Optimization with Exponential Convergence

Authors: Kenji Kawaguchi, Leslie Pack Kaelbling, Tomás Lozano-Pérez

NeurIPS 2015

Reproducibility Assessment (Variable / Result / LLM Response)
Research Type: Experimental
LLM Response: "In this section, we compare the IMGPO algorithm with the SOO, BaMSOO, GP-PI and GP-EI algorithms [18, 2, 3]. [...] The experimental results for eight different objective functions are shown in Figure 2. The vertical axis is log10(f(x*) - f(x+)), where f(x*) is the global optimum and f(x+) is the best value found by the algorithm. [...] As we can see from Figure 2, IMGPO outperformed the other algorithms in general. [...] As can be seen in Table 1, IMGPO is much faster than traditional GP optimization methods, although it is slower than SOO."
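The performance metric quoted above, log10(f(x*) - f(x+)), is the log10 of the optimality gap between the global optimum and the best value found so far. A minimal sketch of how it is computed (the example values are illustrative, not taken from the paper):

```python
import math

def log_regret(f_star: float, f_plus: float) -> float:
    """log10 of the gap between the global optimum f(x*) and the
    best value found by the algorithm, f(x+), for a maximization problem."""
    return math.log10(f_star - f_plus)

# Illustrative values: optimum 0.0, best point found so far -0.001.
# The gap is 1e-3, so the plotted quantity is -3.0.
print(log_regret(0.0, -0.001))  # -> -3.0
```

A more negative value on this axis therefore means the algorithm is closer to the global optimum, which is why the curves in such plots trend downward as iterations increase.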
Researcher Affiliation: Academia
LLM Response: "Kenji Kawaguchi, MIT, Cambridge, MA 02139, kawaguch@mit.edu; Leslie Pack Kaelbling, MIT, Cambridge, MA 02139, lpk@csail.mit.edu; Tomás Lozano-Pérez, MIT, Cambridge, MA 02139, tlp@csail.mit.edu"
Pseudocode: Yes
LLM Response: "Algorithm 1: Infinite-Metric GP Optimization (IMGPO)"
Open Source Code: Yes
LLM Response: "The source code of the proposed algorithm is publicly available at http://lis.csail.mit.edu/code/imgpo.html."
Open Datasets: Yes
LLM Response: "The last five functions are standard benchmarks for global optimization [21]. The first two were used in [18] to test SOO... The experimental results for eight different objective functions are shown in Figure 2. [21] S. Surjanovic and D. Bingham. Virtual Library of Simulation Experiments: Test Functions and Datasets. Retrieved November 30, 2014, from http://www.sfu.ca/~ssurjano, 2014."
Dataset Splits: No
LLM Response: "The paper describes experiments on various objective functions but does not specify any training, validation, or test dataset splits, nor does it mention cross-validation. The functions are used directly for evaluation."
Hardware Specification: No
LLM Response: "The paper does not provide details about the hardware used to run the experiments; no CPU or GPU models or other hardware specifications are mentioned."
Software Dependencies: No
LLM Response: "The paper mentions the algorithms and methods used (e.g., an isotropic Matérn kernel with ν = 5/2, the SOO algorithm, an empirical Bayesian method), but it does not specify any software packages or libraries with version numbers that would be needed to replicate the experiment."
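The isotropic Matérn kernel with ν = 5/2 mentioned above has a well-known closed form. A minimal sketch, using the paper's blind hyperparameter initialization (σ = 1, l = 0.25) as defaults; the function name and vector-as-list interface are illustrative choices, not from the paper:

```python
import math

def matern52(x: list, y: list, sigma: float = 1.0, length: float = 0.25) -> float:
    """Isotropic Matérn kernel with nu = 5/2:
    k(r) = sigma^2 * (1 + s + s^2/3) * exp(-s), where s = sqrt(5) * r / l
    and r is the Euclidean distance between x and y."""
    r = math.dist(x, y)
    s = math.sqrt(5.0) * r / length
    return sigma**2 * (1.0 + s + s**2 / 3.0) * math.exp(-s)

# At zero distance the kernel equals the signal variance sigma^2.
print(matern52([0.0, 0.0], [0.0, 0.0]))  # -> 1.0
```

The kernel value decays monotonically with distance, and ν = 5/2 makes the resulting GP sample paths twice differentiable, a common default for modeling smooth objective functions.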
Experiment Setup: Yes
LLM Response: "Then, we blindly initialized the hyperparameters to σ = 1 and l = 0.25 for all the experiments; these values were updated with an empirical Bayesian method after each iteration. To compute the UCB by GP, we used η = 0.05 for IMGPO and BaMSOO. For IMGPO, Ξmax was fixed to be 2^2 (the effect of selecting different values is discussed later). For BaMSOO and SOO, the parameter hmax was set to n, according to Corollary 4.3 in [18]."
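The "UCB by GP" in the setup above is an upper confidence bound of the form μ(x) + β_n σ(x). As a sketch only: the β_n schedule below is a BaMSOO-style choice valid with probability 1 - η, and may differ from the exact constant used in the paper; η = 0.05 matches the reported setting.

```python
import math

def gp_ucb(mu: float, sigma_x: float, n: int, eta: float = 0.05) -> float:
    """Upper confidence bound mu(x) + beta_n * sigma(x) from the GP posterior.
    beta_n = sqrt(2 * log(pi^2 * n^2 / (6 * eta))) is an assumed BaMSOO-style
    schedule; eta = 0.05 is the confidence parameter from the paper's setup."""
    beta_n = math.sqrt(2.0 * math.log(math.pi**2 * n**2 / (6.0 * eta)))
    return mu + beta_n * sigma_x

# With zero posterior uncertainty the bound reduces to the posterior mean.
print(gp_ucb(0.5, 0.0, n=1))  # -> 0.5
```

In BaMSOO and IMGPO, such a bound is used to cheaply reject cells whose UCB falls below the best value found, saving evaluations of the true objective.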