High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds

Authors: Noémie Jaquier, Leonel Rozo

NeurIPS 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We test our approach in several benchmark artificial landscapes and report that it not only outperforms other high-dimensional BO approaches in several settings, but consistently optimizes the objective functions, as opposed to geometry-unaware BO methods.
Researcher Affiliation | Collaboration | Noémie Jaquier (Idiap Research Institute, 1920 Martigny, Switzerland, and Bosch Center for Artificial Intelligence; noemie.jaquier@kit.edu) and Leonel Rozo (Bosch Center for Artificial Intelligence, 71272 Renningen, Germany; leonel.rozo@de.bosch.com)
Pseudocode | Yes | Algorithm 1: HD-GaBO (a structural sketch of the loop follows the table)
Open Source Code | Yes | Source code is available at https://github.com/NoemieJaquier/GaBOtorch.
Open Datasets | Yes | We consider benchmark test functions defined on a low-dimensional manifold M^d embedded in a high-dimensional manifold M^D. Therefore, the test functions are defined as f : M^d → R, so that y = f(m(x)) with m : M^D → M^d being the nested projection mapping, as defined in Section 3.3. The projection mapping parameters are randomly set for each trial. The search space corresponds to the complete manifold for S^D and to SPD matrices with eigenvalues λ ∈ [0.001, 5] for S_++^D. We carry out the optimization by running 30 trials with random initialization. (A simplified sketch of such a nested test function follows the table.)
Dataset Splits | No | The paper describes the experimental setup and evaluation metrics (e.g., '30 trials with random initialization', '300 BO iterations', 'median of the logarithm of the simple regret') but does not specify explicit train/validation/test dataset splits with percentages or sample counts.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models or memory) for the machines used to run the experiments.
Software Dependencies | No | The paper mentions the software libraries used (GPyTorch, BoTorch, Pymanopt) but does not provide version numbers for these dependencies.
Experiment Setup | Yes | All the tested methods use EI as acquisition function and are initialized with 5 random samples. The GP parameters are estimated using MLE. (A generic sketch of these settings follows the table.)
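
The sketches below expand on three of the rows above. First, the Pseudocode row refers to Algorithm 1 (HD-GaBO). The outline here is only a structural sketch of a nested-manifold BO loop of that kind, not the authors' implementation: the helpers fit_nested_gp, optimize_acqf_on_manifold, and lift_to_high_dim are hypothetical placeholders, and the 5 initial samples and 300 iterations simply mirror the settings quoted elsewhere in the table.

```python
# Hypothetical structural sketch of a nested-manifold BO loop (HD-GaBO-style).
# All helper functions below are placeholders, not the authors' API.

def hd_gabo(objective, high_dim_manifold, latent_manifold, n_init=5, n_iter=300):
    # 1. Initialize with a few random evaluations on the high-dimensional manifold.
    X = [high_dim_manifold.random_point() for _ in range(n_init)]
    Y = [objective(x) for x in X]

    for _ in range(n_iter):
        # 2. Jointly fit the projection mapping m: M^D -> M^d and the
        #    geometry-aware GP hyperparameters by maximum likelihood.
        projection, gp = fit_nested_gp(X, Y, high_dim_manifold, latent_manifold)

        # 3. Maximize the acquisition function (e.g., EI) on the latent
        #    low-dimensional manifold using Riemannian optimization.
        z_next = optimize_acqf_on_manifold(gp, latent_manifold, best_f=min(Y))

        # 4. Map the latent candidate back to the original manifold
        #    (an approximate inverse of the learned projection).
        x_next = lift_to_high_dim(z_next, projection, high_dim_manifold)

        # 5. Evaluate the objective and augment the data.
        X.append(x_next)
        Y.append(objective(x_next))

    best_idx = min(range(len(Y)), key=Y.__getitem__)
    return X[best_idx], Y[best_idx]
```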
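
Second, the Open Datasets row describes benchmark functions defined on a low-dimensional manifold M^d embedded in M^D. The snippet below illustrates the idea for the sphere case under two simplifying assumptions: a random orthonormal linear map followed by renormalization stands in for the paper's nested projection mapping, and the Ackley function stands in for the benchmark; the dimensions are examples only, not necessarily those used in the paper.

```python
import numpy as np

def random_orthonormal_projection(D, d, rng):
    """Random (d+1) x (D+1) matrix with orthonormal rows (a stand-in for the
    randomly set projection parameters; the paper uses a nested-sphere mapping)."""
    A = rng.standard_normal((D + 1, d + 1))
    Q, _ = np.linalg.qr(A)          # columns of Q are orthonormal
    return Q.T                      # shape (d+1, D+1), orthonormal rows

def project_sphere(x, R):
    """Simplified projection m: S^D -> S^d (linear map + renormalization)."""
    z = R @ x
    return z / np.linalg.norm(z)

def ackley(z):
    """Ackley benchmark evaluated on the low-dimensional coordinates."""
    n = z.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(z**2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * z)) / n) + 20.0 + np.e)

# Example: a D-sphere search space with a d-dimensional latent sphere.
rng = np.random.default_rng(0)
D, d = 20, 2                                    # example dimensions
R = random_orthonormal_projection(D, d, rng)    # randomly set per trial
x = rng.standard_normal(D + 1)
x /= np.linalg.norm(x)                          # a point on S^D
y = ackley(project_sphere(x, R))                # y = f(m(x))
```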
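
Third, the Experiment Setup row quotes the shared BO settings: EI acquisition, 5 random initial samples, and GP hyperparameters estimated by MLE. The sketch below shows how those generic settings could be configured in a plain Euclidean BoTorch loop; it deliberately omits the geometry-aware kernels and Riemannian acquisition optimization that define the paper's method, the toy objective and bounds are placeholders, and the fitting entry point (fit_gpytorch_mll) reflects recent BoTorch releases and may differ in older versions.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

def objective(X):
    # Placeholder objective (maximization); replace with the benchmark of interest.
    return -(X ** 2).sum(dim=-1, keepdim=True)

# Example 3-d box search space, for illustration only.
bounds = torch.stack([-torch.ones(3), torch.ones(3)]).double()

# 5 random initial samples, as in the reported setup.
train_X = bounds[0] + (bounds[1] - bounds[0]) * torch.rand(5, 3, dtype=torch.double)
train_Y = objective(train_X)

for _ in range(10):  # a handful of BO iterations for illustration
    # GP hyperparameters estimated by MLE (marginal log-likelihood maximization).
    model = SingleTaskGP(train_X, train_Y)
    mll = ExactMarginalLogLikelihood(model.likelihood, model)
    fit_gpytorch_mll(mll)

    # Expected Improvement acquisition, maximized over the box bounds.
    acqf = ExpectedImprovement(model, best_f=train_Y.max())
    candidate, _ = optimize_acqf(acqf, bounds=bounds, q=1,
                                 num_restarts=10, raw_samples=128)

    # Evaluate the new point and augment the data.
    train_X = torch.cat([train_X, candidate])
    train_Y = torch.cat([train_Y, objective(candidate)])
```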