Learning Bayesian networks with ancestral constraints

Authors: Eunice Yuh-Jie Chen, Yujia Shen, Arthur Choi, Adnan Darwiche

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirically, we demonstrate that our approach can be orders-of-magnitude more efficient than alternative frameworks, such as those based on integer linear programming."
Researcher Affiliation | Academia | "Eunice Yuh-Jie Chen, Yujia Shen, Arthur Choi, Adnan Darwiche, Computer Science Department, University of California, Los Angeles, CA 90095, {eyjchen,yujias,aychoi,darwiche}@cs.ucla.edu"
Pseudocode | No | The paper describes its algorithms and methods in prose but does not provide any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper refers to and links to third-party tools (EVASOLVER, GOBNILP) but does not provide access to its own implementation code.
Open Datasets | Yes | "We simulated different structure learning problems from standard Bayesian network benchmarks ALARM, ANDES, CHILD, CPCS54, and HEPAR2, by (1) taking a random sub-network N of a given size..." The networks used in the experiments are available at http://www.bnlearn.com/bnrepository (see the simulation sketch after this table).
Dataset Splits | No | The paper mentions "simulating a training dataset" and refers to "test cases" in the results tables, but it does not specify explicit training, validation, or test dataset splits or percentages.
Hardware Specification | Yes | "Our experiments were run on a 2.67GHz Intel Xeon X5650 CPU."
Software Dependencies | No | The paper mentions using the "EVASOLVER" partial MaxSAT solver and "GOBNILP" with URLs, but does not provide version numbers for these or for any other software dependencies used in its own implementation.
Experiment Setup | Yes | "We assumed BDeu scores with an equivalent sample size of 1. We further pre-computed the scores of candidate parent sets, which were fed as input into each system evaluated." (See the BDeu scoring sketch after this table.)
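
The Open Datasets row quotes the paper's procedure of simulating structure-learning problems from standard benchmark networks. Below is a minimal sketch of the simulation step, assuming pgmpy and a locally downloaded BIF file from the bnlearn repository; the file name, sample size, and choice of sampler are illustrative assumptions rather than the authors' code, and the random sub-network extraction step is omitted because the quoted description truncates it.

```python
# A minimal sketch of the data-simulation step, assuming pgmpy and a local
# copy of a benchmark network in BIF format (available from
# http://www.bnlearn.com/bnrepository). File name and sample size are
# illustrative assumptions.
from pgmpy.readwrite import BIFReader
from pgmpy.sampling import BayesianModelSampling

# Load a standard benchmark network such as ALARM.
model = BIFReader("alarm.bif").get_model()

# Forward-sample a complete discrete training dataset from the network.
data = BayesianModelSampling(model).forward_sample(size=1000)
print(data.head())
```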
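
The Experiment Setup row states that BDeu scores with an equivalent sample size of 1 were pre-computed for candidate parent sets and fed as input to each evaluated system. The sketch below shows a standard BDeu local-score computation under those settings; the synthetic dataset, the parent-set size bound, and all variable names are illustrative assumptions, not the authors' pipeline.

```python
# A sketch of pre-computing BDeu local scores (equivalent sample size 1)
# for candidate parent sets. The synthetic binary dataset and the size
# bound on parent sets are illustrative assumptions.
from itertools import combinations
import numpy as np
import pandas as pd
from scipy.special import gammaln

def bdeu_local_score(data, child, parents, ess=1.0):
    """Standard BDeu score of `child` given the tuple `parents` on complete data."""
    r = data[child].nunique()  # number of states of the child
    if parents:
        q = int(np.prod([data[p].nunique() for p in parents]))  # parent configurations
        njk = data.groupby(list(parents) + [child]).size()      # cell counts N_jk
        nj = data.groupby(list(parents)).size()                 # row counts N_j
    else:
        q = 1
        njk = data.groupby(child).size()
        nj = pd.Series([len(data)])
    a_jk, a_j = ess / (r * q), ess / q
    # Unobserved configurations contribute zero terms, so summing only the
    # observed counts is sufficient.
    return float((gammaln(a_jk + njk) - gammaln(a_jk)).sum()
                 + (gammaln(a_j) - gammaln(a_j + nj)).sum())

# Illustrative complete discrete dataset (in the paper this would be the
# data simulated from a benchmark network).
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.integers(0, 2, size=(500, 4)), columns=list("ABCD"))

# Pre-compute scores for all candidate parent sets up to size 2 per child;
# such score tables are what gets fed as input to each evaluated system.
scores = {
    (x, ps): bdeu_local_score(data, x, ps)
    for x in data.columns
    for k in range(3)
    for ps in combinations([v for v in data.columns if v != x], k)
}
print(scores[("A", ("B",))])
```

Pre-computing the score table once lets every compared system (the authors' approach, the ILP-based GOBNILP, and others) optimize over identical inputs, isolating search efficiency from scoring cost.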