End-to-End Balancing for Causal Continuous Treatment-Effect Estimation
Authors: Taha Bahadori, Eric Tchetgen Tchetgen, David Heckerman
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Using synthetic and real-world data, we show that our proposed algorithm outperforms entropy balancing in the accuracy of treatment effect estimation. |
| Researcher Affiliation | Collaboration | 1Amazon.com, Inc. 2Wharton School of the University of Pennsylvania. Correspondence to: Mohammad Taha Bahadori <bahadorm@amazon.com>. |
| Pseudocode | Yes | Algorithm 1: Stochastic Training of ℓθ for End-to-End Balancing (a generic illustration is given in the first sketch below the table). |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code for the methodology. |
| Open Datasets | Yes | We study the impact of PM2.5 particle level on the cardiovascular mortality rate (CMR) in 2132 counties in the US using the data provided by the National Studies on Air Pollution and Health (Rappold, 2020). The data is publicly available under U.S. Public Domain license. |
| Dataset Splits | Yes | All neural networks are trained using Adam (Kingma & Ba, 2014) with early stopping based on validation error. The learning rate and architectural parameters of the neural networks are tuned via hyperparameter search on the validation data. |
| Hardware Specification | Yes | We performed our experiments on a CPU machine with 16 cores from a cloud provider that uses hydroelectric power. |
| Software Dependencies | No | The paper mentions using PyTorch but does not specify its version number or any other software dependencies with version numbers. |
| Experiment Setup | Yes | All neural networks are trained using Adam (Kingma & Ba, 2014) with early stopping based on validation error. The learning rate and architectural parameters of the neural networks are tuned via hyperparameter search on the validation data. Learning algorithm: Adam with learning rate 0.001, no AMSGrad; batch size: 100; max epochs: 400; weight decay: 2.5e-5. (These settings are reproduced in the second sketch below the table.) |
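
The paper's Algorithm 1 describes stochastic training of the link function ℓθ for End-to-End Balancing, but its exact objective is not quoted above. The following is therefore only a generic sketch of the idea under stated assumptions: a small neural network stands in for ℓθ, produces normalized balancing weights over a mini-batch, and is trained with Adam to shrink the weighted covariance between the continuous treatment and each covariate. The architecture and the names `link` and `imbalance` are hypothetical, not the paper's.

```python
import torch
from torch import nn

# Hypothetical data shapes: X are covariates, t is a continuous treatment.
X = torch.randn(1000, 10)
t = torch.randn(1000)

# A small network standing in for the link function ℓθ; architecture is an assumption.
link = nn.Sequential(nn.Linear(11, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(link.parameters(), lr=1e-3, weight_decay=2.5e-5)

def imbalance(w, X, t):
    """Weighted covariance between the treatment and each covariate;
    it vanishes when treatment and covariates are balanced under w."""
    t_c = t - (w * t).sum()                                 # center t under w
    X_c = X - (w[:, None] * X).sum(dim=0, keepdim=True)     # center X under w
    cov = (w[:, None] * t_c[:, None] * X_c).sum(dim=0)      # one value per covariate
    return (cov ** 2).sum()

for step in range(400):
    idx = torch.randint(0, X.shape[0], (100,))              # mini-batch of 100
    xb, tb = X[idx], t[idx]
    logits = link(torch.cat([xb, tb[:, None]], dim=1)).squeeze(-1)
    w = torch.softmax(logits, dim=0)                        # normalized balancing weights
    loss = imbalance(w, xb, tb)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Softmax normalization keeps the weights positive and summing to one within a batch, mirroring the normalization that entropy-balancing weights satisfy; the paper's actual link function and loss may differ.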
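
The Experiment Setup row pins down the optimizer exactly; the surrounding loop below is a minimal sketch assuming a generic regression model on synthetic tensors. Only Adam with learning rate 0.001, no AMSGrad, weight decay 2.5e-5, batch size 100, a 400-epoch cap, and early stopping on validation error come from the paper; the patience value, data, and architecture are assumptions.

```python
import copy
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical data; the paper's covariates, treatments, and outcomes would go here.
X = torch.randn(1000, 10)
y = torch.randn(1000, 1)
n_val = 200
train_loader = DataLoader(TensorDataset(X[:-n_val], y[:-n_val]),
                          batch_size=100, shuffle=True)     # batch size 100 (paper)
val_X, val_y = X[-n_val:], y[-n_val:]

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
# Optimizer settings quoted in the Experiment Setup row.
opt = torch.optim.Adam(model.parameters(), lr=1e-3,
                       weight_decay=2.5e-5, amsgrad=False)
loss_fn = nn.MSELoss()

best_val, best_state, patience, bad_epochs = float("inf"), None, 20, 0
for epoch in range(400):                                    # max epochs 400 (paper)
    model.train()
    for xb, yb in train_loader:
        opt.zero_grad()
        loss_fn(model(xb), yb).backward()
        opt.step()
    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(val_X), val_y).item()
    if val_loss < best_val:                                 # early stopping on validation error (paper)
        best_val = val_loss
        best_state = copy.deepcopy(model.state_dict())
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:                          # patience value is an assumption
            break
model.load_state_dict(best_state)
```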