Domain Invariant Learning for Gaussian Processes and Bayesian Exploration
Authors: Xilong Zhao, Siyuan Bian, Yaoyun Zhang, Yuliang Zhang, Qinying Gu, Xinbing Wang, Chenghu Zhou, Nanyang Ye
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical experiments demonstrate the superiority of DIL-GP for predictions on several synthetic and real-world datasets. We further demonstrate the effectiveness of the DIL-GP Bayesian optimization method on a PID parameters tuning experiment for a quadrotor. |
| Researcher Affiliation | Collaboration | ¹Shanghai Jiao Tong University, Shanghai, China; ²Shanghai Artificial Intelligence Laboratory, Shanghai, China |
| Pseudocode | Yes | Algorithm 1: Domain Invariant Learning for Gaussian Processes |
| Open Source Code | Yes | The full version and source code are available at: https://github.com/Billzxl/DILGP. |
| Open Datasets | Yes | The first dataset we consider is a regression dataset (Kaggle) of house sales prices from King County, USA. ... The second real-world dataset under consideration is the Automobile dataset from UCI datasets... |
| Dataset Splits | No | The paper specifies training and test set compositions for various datasets (e.g., 'Our training set consists of one hundred samples from cluster X1 and fifteen samples from cluster X2 while the test set consists of eighty samples from cluster X2.'), but it does not mention a separate validation set. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependency details, such as library names with version numbers (e.g., Python 3.8, PyTorch 1.9). |
| Experiment Setup | No | The main text does not describe hyperparameters or training settings, stating only: 'Detailed settings can be found in the Appendix.' |