Objective Bound Conditional Gaussian Process for Bayesian Optimization
Authors: Taewon Jeong, Heeyoung Kim
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | From Section 6 (Empirical Studies): "We evaluated the performance of BO with the OBCGP and compared it with that of BO with the GP." |
| Researcher Affiliation | Academia | Taewon Jeong 1 Heeyoung Kim 1 1Department of Industrial and Systems Engineering, KAIST, Daejeon, Republic of Korea. Correspondence to: Heeyoung Kim <heeyoungkim@kaist.ac.kr>. |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code for BO with the OBCGP is available at https://github.com/twj-KAIST/OBCGP-BO. |
| Open Datasets | Yes | We applied the OBCGP to optimize the hyperparameters in a multilayer perceptron (MLP) (Le Cun et al., 2015) to classify the popular MNIST dataset (Le Cun and Cortes, 2010). |
| Dataset Splits | Yes | We trained on 60,000 images and tested on 10,000 images using the MLP with two hidden layers of 100 and 50 hidden units, each with a sigmoid activation function... We calculated the validation loss with the cross-entropy loss function... |
| Hardware Specification | No | The paper mentions training models but does not provide specific hardware details (e.g., exact GPU/CPU models, memory amounts, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | No | The paper refers to using an MLP and the MNIST dataset, and provides a GitHub link for code, but does not explicitly list specific software dependencies with version numbers (e.g., 'PyTorch 1.9', 'Python 3.8'). |
| Experiment Setup | Yes | The regularization coefficient and the variance of the injected noise, together with the learning rate, were hyperparameters that were optimized using the OBCGP. The three hyperparameters were searched over the same region from 0.0001 to 0.1. |
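The table above pins down the concrete experiment pieces the paper reports: an MLP with hidden layers of 100 and 50 sigmoid units on MNIST, and three hyperparameters (learning rate, regularization coefficient, injected-noise variance) each searched over [0.0001, 0.1]. The following is a minimal sketch of that setup in NumPy; the layer sizes and search bounds come from the quoted text, while the weight initialization, softmax output layer, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Architecture quoted in the table: 784-dim MNIST inputs, hidden layers of
# 100 and 50 sigmoid units, 10-class output. Initialization is an assumption.
rng = np.random.default_rng(0)
layer_sizes = [784, 100, 50, 10]
weights = [rng.normal(0.0, 0.1, (m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    # Sigmoid hidden layers, softmax output (softmax is assumed here;
    # the paper only states the cross-entropy validation loss).
    for w, b in zip(weights[:-1], biases[:-1]):
        x = sigmoid(x @ w + b)
    logits = x @ weights[-1] + biases[-1]
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# The three hyperparameters tuned by the OBCGP, all searched over the
# same interval [0.0001, 0.1] per the quoted setup.
search_space = {name: (1e-4, 1e-1) for name in
                ("learning_rate", "regularization_coef", "noise_variance")}

# Forward pass on a fake batch of two flattened 28x28 images.
probs = forward(rng.normal(size=(2, 784)))
```

Each row of `probs` is a probability distribution over the 10 digit classes; a BO loop such as the paper's OBCGP would train this network under a candidate point from `search_space` and feed the resulting validation loss back to the surrogate model.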