Combinatorial Bayesian Optimization using the Graph Cartesian Product
Authors: Changyong Oh, Jakub Tomczak, Efstratios Gavves, Max Welling
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We validate COMBO in a wide array of realistic benchmarks, including weighted maximum satisfiability problems and neural architecture search. COMBO outperforms consistently the latest state-of-the-art while maintaining computational and statistical efficiency." and Section 4 (Experiments) |
| Researcher Affiliation | Collaboration | 1 University of Amsterdam, 2 Qualcomm AI Research, 3 CIFAR |
| Pseudocode | Yes | "Algorithm 1 COMBO: Combinatorial Bayesian Optimization on the combinatorial graph" (the graph Cartesian product underlying this combinatorial graph is illustrated in the first sketch after the table) |
| Open Source Code | Yes | The code is available at: https://github.com/QUVA-Lab/COMBO |
| Open Datasets | Yes | "The Branin benchmark is an optimization problem of a non-linear function over a 2D search space [21]."; "We run tests on three benchmarks from the Maximum Satisfiability Competition 2018."; "The objective is to minimize the classification error on validation set of CIFAR10 [26]" |
| Dataset Splits | No | While the paper mentions using a 'validation set of CIFAR10', it does not provide explicit percentages, sample counts, or specific methodology for train/validation/test splits for any of the datasets used in its experiments. |
| Hardware Specification | Yes | "The runtime including evaluation time was measured on a dual 8-core 2.4 GHz (Intel Haswell E5-2630-v3) CPU with 64 GB memory using Python implementations." and "The all runtimes were measured on Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz with python codes." |
| Software Dependencies | No | The paper mentions 'Python implementations' but does not specify version numbers for Python or any other software libraries used in the experiments. |
| Experiment Setup | Yes | "Sampling begins with 100 steps of the burn-in phase. With the updated D of evaluated data, 10 points are sampled without thinning." and "For this purpose, we begin with evaluating 20,000 randomly selected vertices. Twenty vertices with highest acquisition values are used as initial points for acquisition function optimization." (see the second sketch after the table) |
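
The paper's title and the "Pseudocode" row both refer to a combinatorial graph built with the graph Cartesian product. The report does not reproduce that construction, so the following is a minimal, self-contained sketch of the standard graph Cartesian product on toy adjacency dictionaries; the example graphs are assumptions for illustration and this is not the authors' implementation.

```python
def cartesian_product(adj1, adj2):
    """Cartesian product of two graphs given as {vertex: set_of_neighbours} dicts."""
    product = {}
    for u1 in adj1:
        for u2 in adj2:
            # (u1, u2) ~ (v1, v2) iff (u1 == v1 and u2 ~ v2) or (u2 == v2 and u1 ~ v1)
            product[(u1, u2)] = {(u1, v2) for v2 in adj2[u2]} | {(v1, u2) for v1 in adj1[u1]}
    return product

# Toy example (an assumption, not the paper's search space): the product of
# two 3-vertex path graphs is a 3x3 grid graph.
path = {0: {1}, 1: {0, 2}, 2: {1}}
grid = cartesian_product(path, path)
print(len(grid))               # 9 vertices
print(sorted(grid[(1, 1)]))    # [(0, 1), (1, 0), (1, 2), (2, 1)]
```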
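
The "Experiment Setup" row quotes the acquisition-optimization initialization: evaluate 20,000 randomly selected vertices and keep the twenty with the highest acquisition values as starting points. Below is a hedged sketch of only that selection step; the search-space shape and the stand-in `acquisition` score are assumptions, since the paper's GP-based acquisition is not given in this report.

```python
import numpy as np

rng = np.random.default_rng(0)

n_variables, n_categories = 10, 5      # assumed shape of the combinatorial search space
n_random, n_initial = 20_000, 20       # numbers quoted in the "Experiment Setup" row

def acquisition(x):
    """Stand-in acquisition score (placeholder); COMBO uses a GP-based acquisition."""
    return -np.sum((x - 2) ** 2, axis=-1)

# Evaluate 20,000 randomly selected vertices of the combinatorial search space ...
candidates = rng.integers(0, n_categories, size=(n_random, n_variables))
scores = acquisition(candidates)

# ... and keep the twenty with the highest acquisition values as the starting
# points for acquisition-function optimization.
initial_points = candidates[np.argsort(scores)[-n_initial:]]
print(initial_points.shape)            # (20, 10)
```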