Estimating Identifiable Causal Effects on Markov Equivalence Class through Double Machine Learning
Authors: Yonghan Jung, Jin Tian, Elias Bareinboim
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Simulation results corroborate with the theory."; "Experimental studies corroborate with the theory."; "6. Experiments: We evaluate DML-IDP for estimating P_x(y) in Figs. 2a, 2b, and 1. We specify an SCM M for each PAG and generate datasets D from M." |
| Researcher Affiliation | Academia | 1Department of Computer Science, Purdue University, USA 2Department of Computer Science, Iowa State University, USA 3Department of Computer Science, Columbia University, USA. |
| Pseudocode | Yes | Algorithm 1 IFP(x, y, G(V), P) |
| Open Source Code | No | The paper does not provide any explicit statement or link regarding the public availability of its source code. |
| Open Datasets | No | "We specify an SCM M for each PAG and generate datasets D from M."; "All values in the following tables denote parameters for beta distributions, and we sample from each variable according to the parameters." (No concrete access information for a publicly available or open dataset is provided; a hedged sampling sketch follows this table.) |
| Dataset Splits | No | We generate 100 datasets for each sample size N. (No specific dataset splits like percentages or absolute counts for training, validation, and test sets are mentioned.) |
| Hardware Specification | No | We use a single CPU core for XGBoost training and prediction. (No specific hardware details like CPU model, GPU model, or memory specifications are provided.) |
| Software Dependencies | No | "Nuisance functions are estimated using standard techniques available in the literature (refer to Appendix C for details), e.g., conditional probabilities are estimated using a gradient boosting model XGBoost (Chen & Guestrin, 2016)."; "The nuisance functions are estimated with XGBoost (Chen & Guestrin, 2016)." (XGBoost is mentioned, but no specific version number is provided.) |
| Experiment Setup | Yes | "XGBoost is set with default parameters: max_depth=6, subsample=0.8, colsample_bytree=0.8, and n_estimators=100. We use a single CPU core for XGBoost training and prediction. Each experiment is repeated 100 times, and we compute the average AAE." (A hedged configuration sketch follows this table.) |
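
The Open Datasets row only states that each variable is sampled from a beta distribution whose parameters are listed in the paper's appendix tables. A minimal sketch of what such sampling could look like is below; the parameter values, the sample size, and the variable itself are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical beta parameters for one variable of the SCM; the actual
# values appear in the paper's appendix tables and are not reproduced here.
alpha, beta = 2.0, 5.0
n_samples = 1000  # illustrative sample size, not the paper's N

# Draw one synthetic dataset column for this variable.
v = rng.beta(alpha, beta, size=n_samples)
```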
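
Similarly, the Experiment Setup row specifies the XGBoost hyperparameters but not which XGBoost interface was used. The following is a minimal sketch assuming the scikit-learn-compatible XGBClassifier wrapper; the choice of wrapper and the use of n_jobs=1 to enforce a single CPU core are assumptions, while the four hyperparameter values come from the table above.

```python
from xgboost import XGBClassifier

# Hyperparameter values as reported in the Experiment Setup row; the choice
# of the XGBClassifier wrapper and the n_jobs flag are assumptions.
model = XGBClassifier(
    max_depth=6,
    subsample=0.8,
    colsample_bytree=0.8,
    n_estimators=100,
    n_jobs=1,  # single CPU core for training and prediction
)

# In use, model.fit(...) and model.predict_proba(...) would be called once per
# generated dataset; per the table, each experiment is repeated 100 times and
# the average AAE is computed.
```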