Efficient Bayesian network structure learning via local Markov boundary search

Authors: Ming Gao, Bryon Aragam

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We conduct a brief simulation study to demonstrate the performance of Algorithm 1 and compare against some common baselines: PC [39], GES [7]. We focus on the fully discrete setting. All implementation details can be found in Appendix I. The code implementing the TAM algorithm is available at https://github.com/MingGao97/TAM. We stress that the purpose of these experiments is simply to illustrate that the proposed algorithm can be implemented in practice, and successfully recovers the edges in G as predicted by our theoretical results.
Researcher Affiliation | Academia | Ming Gao, The University of Chicago, minggao@uchicago.edu; Bryon Aragam, The University of Chicago, bryon@chicagobooth
Pseudocode | Yes | Algorithm 1 (Learning DAG structure) and Algorithm 2 (Possible Parental Set (PPS) procedure) are provided on pages 3 and 4, respectively.
Open Source Code | Yes | The code implementing the TAM algorithm is available at https://github.com/MingGao97/TAM.
Open Datasets | No | We simulate DAGs from three graph types: Poly-trees (Tree), Erdős–Rényi (ER), and Scale-Free (SF) graphs. ... We generate data according to two models satisfying the equal entropy condition (C3). The paper uses simulated data, not a publicly available dataset with concrete access information.
Dataset Splits | No | The paper conducts a simulation study where data is generated. It mentions sample sizes ('n (Sample size in thousands)') and dimensions ('d=10, d=20, d=30, d=40, d=50'), but does not specify any training, validation, or test dataset splits.
Hardware Specification | No | The paper describes a simulation study but does not provide any specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments.
Software Dependencies | No | The paper mentions comparing against PC [39] and GES [7] and states 'All implementation details can be found in Appendix I,' but does not provide specific software names with version numbers in the main text.
Experiment Setup | No | We simulate DAGs from three graph types: Poly-trees (Tree), Erdős–Rényi (ER), and Scale-Free (SF) graphs. ... We generate data according to two models satisfying the equal entropy condition (C3). ... We evaluate the performance of aforementioned algorithms by Structural Hamming distance (SHD) ... d=10, d=20, d=30, d=40, d=50; n (Sample size in thousands) ... All implementation details can be found in Appendix I. While it describes the general setup, specific hyperparameters are not explicitly stated in the main text.
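The experiment setup above mentions simulating Erdős–Rényi DAGs and scoring recovered structures by Structural Hamming Distance (SHD). As an illustrative sketch only (this is not the authors' TAM code; the function names, edge probability `p`, and adjacency-matrix convention are assumptions), the two ingredients can be written as:

```python
import numpy as np

def simulate_er_dag(d, p, rng=None):
    """Sample an Erdős–Rényi-style DAG on d nodes: fix a topological
    order and keep each forward edge i -> j (i < j) with probability p.
    Returns a 0/1 adjacency matrix; A[i, j] = 1 means edge i -> j."""
    rng = np.random.default_rng(rng)
    # Strictly upper-triangular mask guarantees acyclicity.
    A = np.triu(rng.random((d, d)) < p, k=1).astype(int)
    return A

def shd(A_true, A_est):
    """Structural Hamming Distance between two DAG adjacency matrices:
    each missing edge, extra edge, and reversed edge counts as one error."""
    diff = A_true != A_est
    # A reversed edge flips two entries of the matrix, so it would be
    # double-counted by diff.sum(); identify and subtract one per reversal.
    reversed_edges = (A_true == 1) & (A_est == 0) & (A_est.T == 1) & (A_true.T == 0)
    return int(diff.sum() - reversed_edges.sum())
```

For example, reversing a single edge of the true graph yields an SHD of 1, while deleting all edges yields an SHD equal to the true edge count.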