Structure Learning with Adaptive Random Neighborhood Informed MCMC
Authors: Xitong Liang, Alberto Caron, Samuel Livingstone, Jim Griffin
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | After introducing the technical novelties in PARNI-DAG, we empirically demonstrate its mixing efficiency and accuracy in learning DAG structures on a variety of experiments. |
| Researcher Affiliation | Academia | Xitong Liang, Department of Statistical Sciences, University College London, London, UK, xitong.liang.18@ucl.ac.uk; Alberto Caron, The Alan Turing Institute, London, UK, acaron@turing.ac.uk; Samuel Livingstone, Department of Statistical Sciences, University College London, London, UK, samuel.livingstone@ucl.ac.uk; Jim Griffin, Department of Statistical Sciences, University College London, London, UK, j.griffin@ucl.ac.uk |
| Pseudocode | Yes | The full details and algorithmic pseudo code of PARNI-DAG are also provided in the supplementary material (Appendix D). |
| Open Source Code | Yes | Code to implement the PARNI-DAG proposal and replicate the experimental sections is available at https://github.com/XitongLiang/The-PARNI-scheme/tree/main/Structure-Learning. |
| Open Datasets | Yes | We first consider the real-world protein-signalling dataset [Sachs et al., 2005], found also in Cundy et al. [2021], to test PARNI-DAG's mixing. |
| Dataset Splits | No | The paper mentions generating N=100 observations but does not provide specific details on train/validation/test splits, percentages, or explicit sample counts for the experimental setup. |
| Hardware Specification | Yes | Using Intel i7 2.80 GHz processor |
| Software Dependencies | No | The paper mentions 'R package bnlearn' but does not provide specific version numbers for it or any other software dependencies. |
| Experiment Setup | Yes | For ADR and PARNI-DAG, we use prior parameters g = 10 and h = 1/11. |