Re-Examining Linear Embeddings for High-Dimensional Bayesian Optimization

Authors: Ben Letham, Roberto Calandra, Akshara Rai, Eytan Bakshy

NeurIPS 2020

Reproducibility assessment (variable: result, followed by the LLM response):
Research Type: Experimental. "We show empirically that properly addressing these issues significantly improves the efficacy of linear embeddings for BO on a range of problems, including learning a gait policy for robot locomotion."
Researcher Affiliation: Industry. Benjamin Letham (Facebook, Menlo Park, CA, bletham@fb.com); Roberto Calandra (Facebook AI Research, Menlo Park, CA, rcalandra@fb.com); Akshara Rai (Facebook AI Research, Menlo Park, CA, akshararai@fb.com); Eytan Bakshy (Facebook, Menlo Park, CA, ebakshy@fb.com).
Pseudocode: Yes. "Algorithm 1: ALEBO for linear embedding BO."
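Algorithm 1 itself is not reproduced on this page, but the general shape of BO in a random linear embedding can be sketched as follows. This is a hedged, minimal illustration only: the dimensions, RBF kernel with fixed hyperparameters, candidate-set expected improvement, and the names `n_init`/`n_bo` are generic placeholders, not ALEBO's actual modeling choices (ALEBO additionally handles, e.g., the polytope constraints the embedding induces).

```python
import math
import numpy as np

rng = np.random.default_rng(0)

D, d = 20, 2  # illustrative ambient and embedding dimensions (not from the paper)
B = rng.standard_normal((d, D)) / math.sqrt(d)  # random linear embedding

def objective(x):
    # Toy objective with low effective dimensionality: only x[0], x[1] matter.
    return float(np.sum((x[:2] - 0.3) ** 2))

def embed(y):
    # Lift a low-dimensional point into the ambient box [-1, 1]^D by clipping.
    return np.clip(B.T @ y, -1.0, 1.0)

def rbf(A, C, ls=0.5):
    # Squared-exponential kernel with a fixed lengthscale (a real BO loop fits it).
    sq = ((A[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / ls ** 2)

def gp_posterior(Ytr, ftr, Ycand):
    # Zero-mean GP posterior mean and stddev at the candidate points.
    K = rbf(Ytr, Ytr) + 1e-6 * np.eye(len(Ytr))
    Ks = rbf(Ycand, Ytr)
    mu = Ks @ np.linalg.solve(K, ftr)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sd, best):
    # EI for minimization: E[max(best - f, 0)] under the GP posterior.
    z = (best - mu) / sd
    Phi = np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2))) for v in z])
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * Phi + sd * phi

n_init, n_bo = 5, 15  # placeholder budgets; the paper's values are not stated here
Y = rng.uniform(-1, 1, (n_init, d))
f = np.array([objective(embed(y)) for y in Y])
for _ in range(n_bo):
    cand = rng.uniform(-1, 1, (256, d))  # candidate set in the low-dim embedding
    mu, sd = gp_posterior(Y, f, cand)
    y_next = cand[np.argmax(expected_improvement(mu, sd, f.min()))]
    Y = np.vstack([Y, y_next])
    f = np.append(f, objective(embed(y_next)))
```

The key structural point the sketch shows is that the GP and the acquisition optimization both live in the d-dimensional embedding, while the objective is only ever evaluated at lifted D-dimensional points.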
Open Source Code: Yes. "Code to reproduce the results of this paper is available at github.com/facebookresearch/alebo."
Open Datasets: Yes. "Constrained Neural Architecture Search: We evaluated ALEBO performance on constrained neural architecture search (NAS) for convolutional neural networks using models from NAS-Bench-101 [53]. The NAS problem was to design a cell topology defined by a DAG with 7 nodes and up to 9 edges, which includes designs like ResNet [20] and Inception [47]. We created a D = 36 parameterization, producing a HDBO problem. The objective was to maximize CIFAR-10 test-set accuracy, subject to a constraint that training time was less than 30 mins; see Sec. S9 for full details."
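The exact D = 36 parameterization is specified in the paper's Sec. S9. As an illustration of how such an encoding can arrive at 36 dimensions, here is one plausible (hypothetical, not necessarily the paper's) decoding of a 36-dimensional vector into a NAS-Bench-101-style cell: 21 values score the upper-triangular edges of the 7-node DAG (keeping at most the 9 highest-scoring edges), and 5 × 3 = 15 values one-hot-select an operation for each of the 5 interior nodes.

```python
import numpy as np

# NAS-Bench-101 search-space constants: 7-node DAG, at most 9 edges,
# 3 candidate operations per interior node.
OPS = ["conv3x3-bn-relu", "conv1x1-bn-relu", "maxpool3x3"]
N_NODES, MAX_EDGES = 7, 9

def decode(x):
    """Hypothetical decoding of x in [0, 1]^36 into (adjacency, ops).

    Not the paper's scheme; 21 + 15 = 36 dims is an assumption made here
    purely to illustrate how a continuous HDBO parameterization can work.
    """
    edge_scores = x[:21]                 # one score per upper-triangular edge
    op_scores = x[21:].reshape(5, 3)     # one row of op scores per interior node
    iu = np.triu_indices(N_NODES, k=1)   # the 21 possible DAG edges
    adj = np.zeros((N_NODES, N_NODES), dtype=int)
    keep = np.argsort(edge_scores)[::-1][:MAX_EDGES]  # enforce the edge budget
    adj[iu[0][keep], iu[1][keep]] = 1
    ops = ["input"] + [OPS[i] for i in op_scores.argmax(axis=1)] + ["output"]
    return adj, ops

rng = np.random.default_rng(1)
adj, ops = decode(rng.uniform(size=36))
```

Under such a decoding, every continuous point in [0, 1]^36 maps to a feasible cell, which is what makes the NAS problem amenable to continuous HDBO methods.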
Dataset Splits: No. No specific dataset split percentages or absolute sample counts for train/validation/test sets are provided for the main experiments, nor clear citations to predefined splits. The text mentions that "100 training and 50 test points were randomly sampled" for a specific figure (Fig. 3), but not for the overall experimental setup.
Hardware Specification: No. No specific hardware details (GPU/CPU models, memory, etc.) are provided for running the experiments.
Software Dependencies: No. The paper mentions PyBullet [8] and SciPy's SLSQP but does not provide version numbers for these or other software dependencies.
Experiment Setup: No. While Algorithm 1 mentions `ninit` and `nBO`, their specific values for the experiments are not provided in the main text. Hyperparameters such as learning rates, batch sizes, and optimizer settings are not detailed.