Entropic Causal Inference: Graph Identifiability
Authors: Spencer Compton, Kristjan Greenewald, Dmitriy A Katz, Murat Kocaoglu
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We rigorously evaluate the performance of our algorithms on synthetic data generated from a variety of models, observing improvement over prior work. Finally we test our algorithms on real-world datasets. |
| Researcher Affiliation | Collaboration | 1Massachusetts Institute of Technology, Cambridge, USA 2MIT-IBM Watson AI Lab, Cambridge, USA 3IBM Research, Cambridge, USA 4Purdue University, West Lafayette, USA. |
| Pseudocode | Yes | Algorithm 1 Learning general graphs with oracle |
| Open Source Code | No | The paper refers to an external repository (bnlearn) which contains data, but does not provide a link or statement about open-sourcing the code for the methodology described in this paper. |
| Open Datasets | Yes | We also apply our algorithms on semi-synthetic data using the bnlearn repository (https://www.bnlearn.com/bnrepository/) and demonstrate the applicability of low-entropy assumptions and the proposed method. |
| Dataset Splits | No | The paper does not provide explicit training/test/validation dataset splits (e.g., percentages or sample counts). It mentions using synthetic and real-world data but not how they are partitioned for different stages. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used to run its experiments (e.g., specific GPU/CPU models, memory details). |
| Software Dependencies | No | The paper mentions 'bnlearn' and 'R' but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | We evaluate performance via the structural Hamming distance (SHD) from the estimated graph to the true causal graph. See the Appendix for implementation details. The x axis shows entropy of the exogenous noise. The exogenous noise of the first variable is fixed to be large (3.3 bits), hence it is a high entropy source (HES). |
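The experiment-setup row cites the structural Hamming distance (SHD) as the evaluation metric. A minimal sketch of how SHD is commonly computed is below; this is an illustration of the standard metric, not the paper's own implementation, and the function name and adjacency-matrix encoding are assumptions.

```python
import numpy as np

def structural_hamming_distance(est, true):
    """Count the edge differences (missing, extra, or reversed edges)
    between two DAGs given as binary adjacency matrices, where
    adj[i, j] == 1 encodes a directed edge i -> j."""
    est = np.asarray(est)
    true = np.asarray(true)
    n = est.shape[0]
    shd = 0
    for i in range(n):
        for j in range(i + 1, n):
            # Compare the (possibly oriented) edge status of each vertex pair.
            if (est[i, j], est[j, i]) != (true[i, j], true[j, i]):
                shd += 1  # each missing, extra, or reversed edge costs 1
    return shd

# Hypothetical example: true graph is the chain 0 -> 1 -> 2; the estimate
# reverses the 0 -> 1 edge and omits 1 -> 2 entirely.
true_g = np.array([[0, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]])
est_g = np.array([[0, 0, 0],
                  [1, 0, 0],
                  [0, 0, 0]])
print(structural_hamming_distance(est_g, true_g))  # 2
```

An SHD of 0 means the estimated graph exactly matches the true causal graph, which is why the reported curves decrease as recovery improves.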