Interventional Causal Discovery in a Mixture of DAGs

Authors: Burak Varıcı, Dmitriy Katz, Dennis Wei, Prasanna Sattigeri, Ali Tajer

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate the performance of Algorithm 1 for estimating the true edges in a mixture of DAGs using synthetic data and investigate the need for interventions, the effect of the graph size, and the cyclic complexity.
Researcher Affiliation | Collaboration | Burak Varıcı (Carnegie Mellon University), Dmitriy A. Katz (IBM Research), Dennis Wei (IBM Research), Prasanna Sattigeri (IBM Research), Ali Tajer (Rensselaer Polytechnic Institute)
Pseudocode | Yes | Algorithm 1: Causal Discovery from Interventions on Mixture Models (CADIM)
Open Source Code | Yes | The codebase for the experiments can be found at https://github.com/bvarici/intervention-mixture-DAG.
Open Datasets | No | We use an Erdős–Rényi model G(n, p) with density p = 2/n to generate the component DAGs {Gℓ : ℓ ∈ [K]} for different values of nodes n and mixture components K. We adopt linear structural equation models (SEMs) with Gaussian noise for the causal models...
Dataset Splits | No | We look into the performance of Algorithm 1 under a varying number of nodes n ∈ [5, 30] for a mixture of K = 3 DAGs and using 5000 samples from each DAG. No explicit mention of train/validation/test splits is provided.
Hardware Specification | Yes | Experiments are run on a single commercial CPU.
Software Dependencies | No | The paper mentions using a “partial correlation test” but does not provide specific version numbers for software dependencies or libraries used. (A sketch of such a test is given after this table.)
Experiment Setup | Yes | We use an Erdős–Rényi model G(n, p) with density p = 2/n to generate the component DAGs... We adopt linear structural equation models (SEMs) with Gaussian noise for the causal models, in which the noise for node i is sampled from N(µi, σi²), where µi is sampled uniformly in [−1, 1] and σi² is sampled uniformly in [0.5, 1.5]. The edge weights are sampled uniformly in [0.25, 2]. (See the data-generation sketch after this table.)
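
For concreteness, the following is a minimal sketch of the synthetic setup quoted in the Open Datasets, Dataset Splits, and Experiment Setup rows: Erdős–Rényi DAGs G(n, p) with density p = 2/n, linear-Gaussian SEMs with noise means drawn uniformly from [−1, 1], noise variances from [0.5, 1.5], edge weights from [0.25, 2], and 5000 samples from each of K = 3 component DAGs. The function names and the use of NumPy are assumptions on our part; this is not the authors' implementation (see the repository linked above for that).

```python
# Hedged sketch of the stated synthetic setup; names and structure are assumptions,
# only the numerical ranges come from the paper's quoted experiment description.
import numpy as np

def sample_er_dag(n, rng):
    """Sample a weighted adjacency matrix of an Erdős–Rényi DAG with density p = 2/n."""
    p = 2.0 / n
    order = rng.permutation(n)  # edges only go forward in this order, guaranteeing acyclicity
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                W[order[i], order[j]] = rng.uniform(0.25, 2.0)  # edge weights in [0.25, 2]
    return W

def sample_linear_sem(W, num_samples, rng):
    """Draw samples from a linear-Gaussian SEM with the stated noise ranges."""
    n = W.shape[0]
    mu = rng.uniform(-1.0, 1.0, size=n)   # noise means in [-1, 1]
    var = rng.uniform(0.5, 1.5, size=n)   # noise variances in [0.5, 1.5]
    noise = rng.normal(mu, np.sqrt(var), size=(num_samples, n))
    # X_j = sum_i W[i, j] X_i + N_j  =>  X = N (I - W)^{-1} with samples as rows
    return noise @ np.linalg.inv(np.eye(n) - W)

rng = np.random.default_rng(0)
n, K, samples_per_dag = 10, 3, 5000
dags = [sample_er_dag(n, rng) for _ in range(K)]
data = [sample_linear_sem(W, samples_per_dag, rng) for W in dags]
```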
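
The Software Dependencies row notes that a partial correlation test is used. Below is a hedged sketch of a standard Fisher-z partial correlation conditional-independence test of that kind; the function name, the NumPy/SciPy usage, and the default significance level are assumptions, and the test in the authors' codebase may differ in its details.

```python
# Generic Fisher-z partial correlation CI test; a sketch, not the authors' code.
import numpy as np
from scipy.stats import norm

def partial_corr_test(data, i, j, cond_set, alpha=0.05):
    """Test X_i independent of X_j given X_S via Fisher-z transformed partial correlation."""
    idx = [i, j] + list(cond_set)
    corr = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.pinv(corr)
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])  # partial correlation of X_i, X_j given X_S
    r = np.clip(r, -0.999999, 0.999999)
    z = 0.5 * np.log((1 + r) / (1 - r))                 # Fisher z-transform
    stat = np.sqrt(data.shape[0] - len(cond_set) - 3) * abs(z)
    p_value = 2 * (1 - norm.cdf(stat))
    return p_value > alpha  # True means independence is not rejected at level alpha
```

For example, `partial_corr_test(data[0], 0, 1, [2])` would test whether nodes 0 and 1 are conditionally independent given node 2 in the samples from the first component DAG generated above.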