A Bayesian Take on Gaussian Process Networks

Authors: Enrico Giudice, Jack Kuipers, Giusi Moffa

NeurIPS 2023

Reproducibility assessment (variable: result, followed by the supporting LLM response):

Research Type: Experimental
    "Simulation studies show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network and provides an accurate approximation of its posterior distribution. (...) In Sections 4 and 5 we evaluate and compare our algorithm to existing approaches on simulated and real data."

Researcher Affiliation: Academia
    "Enrico Giudice, Dep. of Mathematics and Computer Science, University of Basel, Basel, Switzerland (...) Jack Kuipers, Dep. of Biosystems Science and Engineering, ETH Zurich, Basel, Switzerland (...) Giusi Moffa, Dep. of Mathematics and Computer Science, University of Basel, Basel, Switzerland, and Division of Psychiatry, University College London, London, UK"

Pseudocode: Yes
    "Algorithm 1: GP network sampling scheme"

Open Source Code: Yes
    "R code to implement Algorithm 1 and reproduce the results in Sections 4 and 5 is available at https://github.com/enricogiudice/Learning GPNs."

Open Datasets: Yes
    "We evaluate the Bayesian GP network inference scheme on data generated from known random networks with n = 10 nodes. (...) We also applied the GP score-based structure inference to the flow cytometry dataset of Sachs et al. [39] to learn protein signaling pathways."

Dataset Splits: No
    The paper evaluates on simulated and real data but does not specify training, validation, or test splits (e.g., percentages, counts, or named standard splits).

Hardware Specification: No
    The paper does not report the hardware (e.g., GPU models, CPU types, memory) used to run the experiments.

Software Dependencies: Yes
    "Bayesian inference and optimization of the hyperparameters were performed via the Stan interface RStan [43]. (...) The library offers a highly efficient C++ implementation of the No-U-Turn sampler [22]. We performed MC estimation of the marginal likelihood via the bridgesampling package [19] (...). Given its good performance in benchmarking studies [37], we use the recent BiDAG [45] hybrid implementation for MCMC inference on the graph. (...) RStan: the R interface to Stan, R package version 2.21.8, 2023."

Experiment Setup: Yes
    "For the bridge sampling estimator (12) of the marginal likelihood we used N1 = N2 = 300 particles from the proposal and posterior distribution over the hyperparameters. The proposal function g was set to a normal distribution, with its first two moments chosen to match those of the posterior distribution. (...) For every node Xi in each randomly generated network, we then sample 100 observations as a non-linear function of its parents."
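The simulation setup quoted above (random networks with n = 10 nodes, 100 observations per node sampled as a non-linear function of its parents) can be sketched as follows. This is a generic illustration, not the paper's R code: the edge probability, the tanh non-linearity, and the unit-variance noise are assumptions made here for the sketch.

```python
import numpy as np

def simulate_network_data(n_nodes=10, n_obs=100, edge_prob=0.3, seed=0):
    """Sample observations from a random DAG in which each node is a
    non-linear function of its parents plus Gaussian noise."""
    rng = np.random.default_rng(seed)
    # Strictly upper-triangular adjacency matrix => acyclic by construction;
    # A[p, i] = 1 means node p is a parent of node i.
    A = np.triu((rng.random((n_nodes, n_nodes)) < edge_prob).astype(int), k=1)
    X = np.zeros((n_obs, n_nodes))
    for i in range(n_nodes):  # column order is a topological order
        parents = np.flatnonzero(A[:, i])
        # Assumed non-linearity: sum of tanh-transformed parent values.
        signal = sum(np.tanh(X[:, p]) for p in parents)
        X[:, i] = signal + rng.standard_normal(n_obs)
    return A, X

A, X = simulate_network_data()
print(X.shape)  # (100, 10): 100 observations for each of the 10 nodes
```

Generating the graph in strictly upper-triangular form avoids a separate acyclicity check, since any edge can only point from a lower to a higher node index.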
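The bridge sampling configuration quoted above (a normal proposal g with moments matched to the posterior, N1 = N2 = 300 particles) can be illustrated on a toy conjugate-normal model where the true marginal likelihood is available in closed form. The model and the iterative Meng-Wong update below are a generic sketch, not the paper's estimator (12) for GP hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(1)
y = 1.0        # single observation in the toy model
N1 = N2 = 300  # particles from the posterior (N1) and the proposal (N2)

# Unnormalized posterior q(theta) = p(y | theta) p(theta) for the toy model
# theta ~ N(0, 1), y | theta ~ N(theta, 1).
def log_q(theta):
    return -0.5 * (y - theta) ** 2 - 0.5 * theta ** 2 - np.log(2.0 * np.pi)

# The exact posterior here is N(y/2, 1/2); in practice these would be MCMC draws.
theta_post = y / 2.0 + np.sqrt(0.5) * rng.standard_normal(N1)

# Normal proposal g with its first two moments matched to the posterior sample.
m, s = theta_post.mean(), theta_post.std(ddof=1)
theta_prop = m + s * rng.standard_normal(N2)

def log_g(theta):
    return -0.5 * ((theta - m) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))

# Iterative bridge sampling estimate of the marginal likelihood Z = p(y).
l_post = np.exp(log_q(theta_post) - log_g(theta_post))
l_prop = np.exp(log_q(theta_prop) - log_g(theta_prop))
s1, s2 = N1 / (N1 + N2), N2 / (N1 + N2)
Z = 1.0
for _ in range(200):  # fixed-point iteration for the optimal bridge function
    Z = (np.mean(l_prop / (s1 * l_prop + s2 * Z))
         / np.mean(1.0 / (s1 * l_post + s2 * Z)))

true_Z = np.exp(-y ** 2 / 4.0) / np.sqrt(4.0 * np.pi)  # marginal is N(y; 0, 2)
print(Z, true_Z)
```

Because the moment-matched proposal nearly coincides with the posterior in this conjugate example, the importance ratios q/g are close to constant and the estimate stabilizes within a few iterations.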