On the inability of Gaussian process regression to optimally learn compositional functions
Authors: Matteo Giordano, Kolyan Ray, Johannes Schmidt-Hieber
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We rigorously prove that deep Gaussian process priors can outperform Gaussian process priors if the target function has a compositional structure. To this end, we study information-theoretic lower bounds for posterior contraction rates for Gaussian process regression in a continuous regression model. |
| Researcher Affiliation | Academia | Matteo Giordano, Department of Statistics, University of Oxford, matteo.giordano@stats.ox.ac.uk; Kolyan Ray, Department of Mathematics, Imperial College London, kolyan.ray@imperial.ac.uk; Johannes Schmidt-Hieber, Department of Applied Mathematics, University of Twente, a.j.schmidt-hieber@utwente.nl |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statement about making source code for the described methodology publicly available, nor does it provide a link to a code repository. |
| Open Datasets | No | The paper is theoretical and does not conduct empirical studies with datasets, therefore it does not mention publicly available training datasets. |
| Dataset Splits | No | The paper is theoretical and does not conduct empirical studies, therefore it does not provide training/test/validation dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not provide specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training settings. |
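The paper itself contains no experiments, but as context for the model it studies, the following is a minimal illustrative sketch (not from the paper) of Gaussian process regression applied to a compositional target function f(x) = g(h(x)). The kernel choice, length-scale, noise level, and the particular g and h are all assumptions made for illustration.

```python
import numpy as np

def gp_posterior_mean(X_train, y_train, X_test, lengthscale=0.2, noise=0.1):
    """Posterior mean of GP regression with an RBF kernel (illustrative sketch)."""
    def rbf(A, B):
        # squared Euclidean distances between rows of A and B
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-0.5 * d2 / lengthscale**2)

    K = rbf(X_train, X_train) + noise**2 * np.eye(len(X_train))
    K_star = rbf(X_test, X_train)
    # standard GP regression posterior mean: K_* (K + sigma^2 I)^{-1} y
    return K_star @ np.linalg.solve(K, y_train)

# Hypothetical compositional target: h(x) = |2x - 1|, g(u) = sin(pi * u)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (50, 1))
y = np.sin(np.pi * np.abs(2 * X[:, 0] - 1)) + 0.1 * rng.standard_normal(50)
X_grid = np.linspace(0, 1, 5)[:, None]
pred = gp_posterior_mean(X, y, X_grid)
```

The paper's lower bounds concern exactly this kind of setup: a (non-deep) GP prior fit to data whose regression function has compositional structure, for which the authors show the posterior cannot contract at the optimal rate.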