Generalizing Bayesian Optimization with Decision-theoretic Entropies

Authors: Willie Neiswanger, Lantao Yu, Shengjia Zhao, Chenlin Meng, Stefano Ermon

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate our proposed method on the example tasks described in Section 5: top-k optimization with diversity, multi-level set estimation, and sequence search. For these applications, we show comparisons against a set of baselines on real and synthetic black-box functions. (A hedged sketch of one such task loss appears below the table.)
Researcher Affiliation | Academia | Computer Science Department, Stanford University, Stanford, CA 94305. {neiswanger,lantaoyu,sjzhao,chenlin,ermon}@cs.stanford.edu
Pseudocode | Yes | Algorithm 1: H_{ℓ,A}-Entropy Search. (A minimal illustrative sketch of this style of acquisition appears below the table.)
Open Source Code | Yes | All code and instructions are included in supplementary material.
Open Datasets | Yes | We also compare each method on the Vaccination function (provided by [53]), which returns the vaccination rate for locations in the continental United States, given an input (latitude, longitude). [...] In the top row, we compare the performance of all methods, showing the accuracy vs. iteration. Here, the Pennsylvania Night Light function [1], released by NASA (additional details in the appendix), returns the relative level of light at a location in Pennsylvania, as queried by a satellite image.
Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, or detailed splitting methodology) for training, validation, or test sets.
Hardware Specification | Yes | Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] See appendix B.
Software Dependencies | No | The paper mentions 'GPyTorch [17] and BoTorch [4]' but does not specify their version numbers or other software dependencies with versions.
Experiment Setup | No | The paper states 'All training details are specified in the paper and included code,' but does not provide specific hyperparameter values, training configurations, or system-level settings in the main text.
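
The Research Type row above names top-k optimization with diversity as one of the paper's example tasks. Below is a minimal sketch of one plausible loss ℓ(f, a) for that setting, where the action a is a set of k points; the function name, the pairwise-distance diversity term, and the weight lam are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def topk_diverse_loss(f_vals, points, lam=0.5):
    # Hypothetical top-k-with-diversity loss l(f, a): negative sum of f at the
    # k chosen points, minus a pairwise-distance diversity bonus. Lower loss
    # means higher function values at more spread-out locations.
    pairwise = sum(np.linalg.norm(points[i] - points[j])
                   for i in range(len(points))
                   for j in range(i + 1, len(points)))
    return -np.sum(f_vals) - lam * pairwise

# Example: an action of k = 3 points in 2-D with their f values.
pts = np.array([[0.1, 0.2], [0.8, 0.9], [0.5, 0.1]])
vals = np.array([1.3, 0.7, 1.1])
print(topk_diverse_loss(vals, pts))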
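
The Pseudocode row references Algorithm 1, the H_{ℓ,A}-Entropy Search acquisition. The sketch below is a hedged NumPy illustration of that style of acquisition, not the authors' released implementation: it hand-rolls an exact 1-D GP with an RBF kernel, uses the simple loss ℓ(f, a) = -f(a) over a grid of actions A, and estimates the expected decision-theoretic entropy H_{ℓ,A} = min_a E[ℓ(f, a)] of the fantasized posterior by two-level Monte Carlo. All function names and hyperparameters here are assumptions for illustration.

```python
import numpy as np

def rbf(X1, X2, ls=0.2, var=1.0):
    """RBF kernel matrix between two sets of 1-D points."""
    d = X1[:, None] - X2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Posterior mean and covariance of a zero-mean GP at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    cov = rbf(Xs, Xs) - Ks.T @ sol
    return mu, cov

def dt_entropy(mu, cov, n_samples=64, seed=0):
    """Decision-theoretic entropy H_{l,A} = min_a E[l(f, a)] for the simple
    loss l(f, a) = -f(a), estimated from posterior samples of f on the action
    grid. (For this linear loss the estimate converges to -max_a mu(a); the
    Monte Carlo form is kept because it generalizes to nonlinear losses.)"""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(mu)))
    fs = mu[None, :] + rng.standard_normal((n_samples, len(mu))) @ L.T
    return (-fs).mean(axis=0).min()      # min over actions of E[l(f, a)]

def expected_post_entropy(x, X, y, A, n_fantasy=16, seed=1):
    """Expected H_{l,A}-entropy of the posterior after querying x, averaged
    over fantasy observations y_x; the next query minimizes this quantity."""
    rng = np.random.default_rng(seed)
    mu_x, cov_x = gp_posterior(X, y, np.array([x]))
    ys = mu_x[0] + np.sqrt(cov_x[0, 0]) * rng.standard_normal(n_fantasy)
    Hs = []
    for y_fantasy in ys:                 # condition on each fantasized outcome
        mu_A, cov_A = gp_posterior(np.append(X, x), np.append(y, y_fantasy), A)
        Hs.append(dt_entropy(mu_A, cov_A))
    return float(np.mean(Hs))

# Toy run on a hypothetical 1-D black-box function.
f = lambda x: np.sin(3 * x)
X = np.array([0.1, 0.5, 0.9]); y = f(X)  # observations so far
A = np.linspace(0.0, 1.0, 50)            # action set, also used as query grid
scores = [expected_post_entropy(x, X, y, A) for x in A]
print("next query:", A[int(np.argmin(scores))])
```

Swapping dt_entropy's inner expected loss for a set-valued loss such as the top-k-with-diversity sketch above (with the min taken over candidate subsets) recovers the more general decision-theoretic setting the paper targets; the simple linear loss is used here only to keep the sketch short.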