Local-to-Global Bayesian Network Structure Learning

Authors: Tian Gao, Kshitij Fadnis, Murray Campbell

ICML 2017

Reproducibility assessment (Variable: Result, followed by the LLM's justification):
Research Type: Experimental. We test the algorithms on benchmark BN datasets from the BN repository, using the datasets provided by existing works (Tsamardinos et al., 2006). We run the algorithms with 1000 samples of each dataset 10 times, and compare the BDeu scores of each algorithm, along with the standard deviation, shown in Table 1. We also compare the algorithms on a synthetic 7-node network for DP and BNStruct Learn so that the algorithms can return results within the time limit. We report the running time (the entire algorithmic time, including data access, score computation, and structure search) of each algorithm, along with the standard deviation, shown in Table 2, with a maximum running time of 24 hours.
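The evaluation protocol quoted above (10 runs on 1000-sample draws, reporting the mean BDeu score with its standard deviation) can be sketched as follows. This is a minimal illustration of the protocol only: `learn_and_score` is a hypothetical placeholder and does not reproduce the paper's learners (GGSL, DP, CSL, GOBNILP) or a real BDeu computation.

```python
import random
import statistics

def learn_and_score(samples):
    """Hypothetical stand-in for 'learn a structure, return its BDeu score'.
    BDeu is a log-marginal-likelihood score, so real values are negative."""
    random.seed(len(samples))          # deterministic stand-in, not a real learner
    return -2000.0 + random.random()

def evaluate(dataset, n_samples=1000, n_runs=10):
    """Run the learner n_runs times on n_samples-sized draws from the
    dataset and report the mean score and its standard deviation."""
    scores = []
    for run in range(n_runs):
        random.seed(run)               # distinct draw per run
        draw = [random.choice(dataset) for _ in range(n_samples)]
        scores.append(learn_and_score(draw))
    return statistics.mean(scores), statistics.stdev(scores)

# Toy two-variable dataset, standing in for a BN repository dataset.
mean_score, std_score = evaluate(dataset=[(0, 1), (1, 0), (1, 1)])
print(f"BDeu: {mean_score:.2f} +/- {std_score:.2f}")
```

The same mean-plus-standard-deviation reporting applies to the running-time comparison in Table 2.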
Researcher Affiliation: Industry. IBM Thomas J. Watson Research Center, Yorktown Heights, NY 10598 USA. Correspondence to: Tian Gao <tgao@us.ibm.com>.
Pseudocode: Yes. Algorithm 1: Local Learn; Algorithm 2: Graph Growing Structure Learning; Algorithm 3: GGSL Subroutine; Algorithm 4: update Graph.
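The algorithm names above suggest an outer loop that grows the global graph by repeated local learning. A minimal sketch of that control flow is given below; `local_learn` and `update_graph` are hypothetical stand-ins named after Algorithms 1 and 4 and do not reproduce the paper's actual score-based subroutines.

```python
def local_learn(variable, data):
    # Hypothetical stand-in for Algorithm 1: return the estimated
    # neighbors (local structure) of `variable`. Here the toy "data"
    # directly lists each variable's neighbors.
    return set(data[variable])

def update_graph(graph, variable, neighbors):
    # Hypothetical stand-in for Algorithm 4: merge the newly learned
    # local structure into the global (undirected) edge set.
    graph.setdefault(variable, set()).update(neighbors)
    for n in neighbors:
        graph.setdefault(n, set()).add(variable)
    return graph

def ggsl(data, seed_variable):
    """Sketch of the graph-growing loop suggested by Algorithm 2:
    start from one variable, learn locally, and expand the frontier
    of newly discovered variables until none remain."""
    graph, visited, frontier = {}, set(), [seed_variable]
    while frontier:
        v = frontier.pop()
        if v in visited:
            continue
        visited.add(v)
        neighbors = local_learn(v, data)
        graph = update_graph(graph, v, neighbors)
        frontier.extend(n for n in neighbors if n not in visited)
    return graph

# Toy chain A - B - C: growing from A should recover all three variables.
toy = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
g = ggsl(toy, "A")
```

The point of the sketch is the local-to-global pattern itself: the global structure is assembled incrementally from local results rather than searched over all variables at once.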
Open Source Code: No. The paper mentions using existing implementations of DP, CSL, and GOBNILP, and implementing their algorithms in MATLAB, but does not provide any link or explicit statement about making their own source code publicly available.
Open Datasets: Yes. We test the algorithms on benchmark BN datasets from the BN repository (http://www.bnlearn.com/bnrepository/), using the datasets provided by existing works (Tsamardinos et al., 2006).
Dataset Splits: No. The paper mentions using "1000 samples of each dataset 10 times" but does not specify how these samples were split into training, validation, or test sets for reproducibility purposes.
Hardware Specification: Yes. The experiments are conducted on a machine with an Intel i5-3320M at 2.6 GHz and 8 GB of memory.
Software Dependencies: No. The paper states, "We use the existing implementation of DP (in MATLAB), CSL (in C), and GOBNILP (in C), and implement our algorithms in MATLAB," but does not specify version numbers for MATLAB or the compilers and libraries used for the C implementations.
Experiment Setup: No. The paper describes the general experimental process (e.g., running the algorithms on 1000 samples of each dataset 10 times) but does not provide algorithm-specific settings, such as scoring-function parameters, or other detailed configurations needed to reproduce the proposed method exactly.