Causal Discovery with Fewer Conditional Independence Tests
Authors: Kirankumar Shiragur, Jiaqi Zhang, Caroline Uhler
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we test our proposed method and compare it to existing causal discovery methods on synthetic data and a toy real-world example. |
| Researcher Affiliation | Academia | 1Eric and Wendy Schmidt Center, Broad Institute 2Laboratory for Information & Decision Systems, Massachusetts Institute of Technology. |
| Pseudocode | Yes | Algorithm 1 Learning a Prefix Vertex Set |
| Open Source Code | Yes | Source code for these results can be found at https://github.com/uhlerlab/CCPG. |
| Open Datasets | Yes | To illustrate the utility of the coarser representation learned by CCPG in real-world settings, we include a simple 6-variable Airfoil example (Asuncion & Newman, 2007; Lam et al., 2022) |
| Dataset Splits | No | The paper discusses generating samples and increasing their number to recover the true graph, but it does not specify explicit training, validation, or test dataset splits with percentages, counts, or predefined partition methodologies. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, memory, or cloud instance types used for running the experiments. |
| Software Dependencies | No | For FCI and RFCI, we used the implementations in Kalisch et al. (2024), which are written in R with C++ accelerations. For PC and GSP, we used the implementation in Squires, which is written in Python. Our method, CCPG, is written in Python. The paper names the programming languages and implementations used but does not provide specific version numbers for them or for any associated libraries. |
| Experiment Setup | No | The paper specifies the data-generation process (linear causal models with additive Gaussian noise, 100k samples, a 10-node in-star-shaped DAG) but does not report specific hyperparameter values, optimizer settings, or other detailed training configurations typically found in an experimental-setup section. |
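The data-generation process described in the Experiment Setup row can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' code: the edge weights, noise scales, and random seed are assumptions, and the paper's actual sampling details may differ. It draws 100k samples from a linear causal model with additive Gaussian noise on a 10-node in-star DAG (nine source nodes all pointing into one sink).

```python
import numpy as np

# Hypothetical sketch of the synthetic setup described in the paper:
# a 10-node in-star-shaped DAG (nodes 0..8 each point into node 9),
# linear structural equations, additive Gaussian noise, 100k samples.
# Edge-weight range and unit noise variance are illustrative assumptions.
rng = np.random.default_rng(0)

n_nodes, n_samples = 10, 100_000
center = n_nodes - 1  # the sink node that all other nodes point into

# One random edge weight per leaf -> center edge (assumed range).
weights = rng.uniform(0.5, 1.5, size=n_nodes - 1)

# Every node gets its own standard Gaussian noise term; the leaves
# are exogenous, so their columns are pure noise.
X = rng.standard_normal((n_samples, n_nodes))

# The center node is a weighted sum of its nine parents plus its noise.
X[:, center] += X[:, :center] @ weights

print(X.shape)  # (100000, 10)
```

With enough samples from such a model, each leaf is marginally dependent on the center but the leaves are mutually independent, which is what a conditional-independence-based method like CCPG would exploit to recover the star structure.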