Differential Semantics of Intervention in Bayesian Networks

Authors: Biao Qin

IJCAI 2015

Reproducibility assessment (variable: result — LLM response)
Research Type: Experimental. To test our algorithm, we study its performance on real Bayesian networks. We implement our Full Do and Do based on BNJ [kdd, 2006]. The experiments were conducted on a PC with an Intel Core 2 at 1.8 GHz, 4.0 GB of memory, and Linux. We compute the full atomic intervention of real-world Bayesian networks selected from the benchmark [ace, 2003]. The statistics of these networks are summarized in Table 1, where ∑|Vi| denotes the total number of values over all network variables in a Bayesian network and |V̄| the average number of values per network variable. Since both Full Do and Do are based on the jointree algorithm, we first compile the jointree for them and use Compile to denote the compilation phase of Full Do and Do. We next compute the full atomic intervention of all nodes with respect to evidence variables and use Full Do-I and Do-I to denote the time for computing the full atomic intervention of all nodes in the outward phase by our method and Pearl's method, respectively. From Table 2, we can see that Full Do outperforms Do on all Bayesian networks.
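The experiments above time full atomic interventions computed over a jointree. As a much simpler, hedged illustration of what a single atomic intervention do(X = x) means in Pearl's semantics (this is not the paper's Full Do algorithm; the network, variable names, and probabilities below are invented for illustration), one can contrast the observational marginal with the post-intervention distribution obtained by "mutilating" the network, i.e. deleting the intervened variable's CPT and clamping its value:

```python
# Tiny two-node discrete Bayesian network A -> B, CPTs as plain dicts.
# All numbers here are made up for illustration.
P_A = {0: 0.6, 1: 0.4}                       # P(A)
P_B_given_A = {0: {0: 0.9, 1: 0.1},          # P(B | A=0)
               1: {0: 0.2, 1: 0.8}}          # P(B | A=1)

def joint(a, b):
    """Pre-intervention joint P(A=a, B=b) by the chain rule."""
    return P_A[a] * P_B_given_A[a][b]

def p_b_do_a(x, b):
    """Post-intervention P(B=b | do(A=x)): A's CPT is removed and A is
    clamped to x, so only the CPT of B remains."""
    return P_B_given_A[x][b]

# Observational marginal P(B=1) mixes over A; under do(A=1) it is simply
# P(B=1 | A=1), because the intervention severs A from its (empty) parents.
p_b1 = sum(joint(a, 1) for a in (0, 1))      # 0.6*0.1 + 0.4*0.8 = 0.38
p_b1_do = p_b_do_a(1, 1)                     # 0.8
```

In this root-node case do(A=1) coincides with conditioning on A=1; the two differ once the intervened variable has parents, which is exactly the situation the jointree-based Full Do and Do algorithms handle at scale.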
Researcher Affiliation: Academia. Biao Qin, School of Information, Renmin University of China, Beijing, China; qinbiao@ruc.edu.cn
Pseudocode: Yes. Algorithm 1 (Full Do)
Open Source Code: No. The paper mentions 'BNJ [kdd, 2006]' and references 'http://sourceforge.net/projects/bnj/. 2006.', a third-party tool. It provides no link to, or explicit statement about the release of, the authors' own code for the methodology described in the paper.
Open Datasets: Yes. "We compute the full atomic intervention of the real world Bayesian networks, which are selected from the benchmark [ace, 2003]. The statistics of these networks are summarized in Table 1..." [ace, 2003] http://reasoning.cs.ucla.edu/ace. 2003.
Dataset Splits: No. The paper uses benchmark Bayesian networks but gives no details on training, validation, or test splits; it focuses on inference over the networks themselves rather than on model training with explicit data splits.
Hardware Specification: Yes. "The experiments were conducted on a PC with Intel core2, 1.8GHz, 4.0G memory and Linux."
Software Dependencies: No. The paper states: 'We implement our Full Do and Do based on BNJ [kdd, 2006]'. While BNJ is identified, no specific version of BNJ is given, nor are versions for 'Linux' or any other libraries.
Experiment Setup: No. The paper reports the algorithms' execution times but gives no experimental setup details such as hyperparameters, learning rates, or optimizer settings, as is common in machine learning papers. This is an inference-focused paper where such details are less relevant.