GANITE: Estimation of Individualized Treatment Effects using Generative Adversarial Nets

Authors: Jinsung Yoon, James Jordon, Mihaela van der Schaar

ICLR 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We test our method on three real-world datasets (with both binary and multiple treatments) and show that GANITE outperforms state-of-the-art methods."
Researcher Affiliation | Academia | Jinsung Yoon, Department of Electrical and Computer Engineering, University of California, Los Angeles, CA 90095, USA (jsyoon0823@g.ucla.edu); James Jordon, Department of Engineering Science, University of Oxford, UK (james.jordon@wolfson.ox.ac.uk); Mihaela van der Schaar, Department of Engineering Science, University of Oxford, UK, and Alan Turing Institute, London, UK (mihaela.vanderschaar@eng.ox.ac.uk)
Pseudocode | Yes | "Algorithm 1: Pseudo-code of GANITE"
Open Source Code | No | The paper links to code for a baseline method (CFRWASS): "For instance, the hyper-parameters of CFRWASS are optimized using cfr_param_search.py file which is published in https://github.com/clinicalml/cfrnet". However, it does not provide a link or any explicit statement about the availability of source code for GANITE itself.
Open Datasets | Yes | "IHDP and Jobs are well described in Shalit et al. (2017); Hill (2011); Dehejia & Wahba (2002a) and the Appendix. Twins: This dataset is derived from all births in the USA between 1989-1991 (Almond et al. (2005))."
Dataset Splits | Yes | "Each dataset is divided 56/24/20% into training/validation/testing sets."
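The 56/24/20% split reported above can be reproduced with a simple shuffled index partition. The sketch below is illustrative (the paper does not publish its splitting code); the function name and random seed are assumptions:

```python
import numpy as np

def split_indices(n, seed=0):
    """Shuffle indices 0..n-1 and split them 56/24/20 into train/val/test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_train = int(0.56 * n)
    n_val = int(0.24 * n)
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:]
    return train, val, test

# Example: 1000 samples -> 560 / 240 / 200
train, val, test = split_indices(1000)
print(len(train), len(val), len(test))  # 560 240 200
```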
Hardware Specification | No | The paper does not report the hardware used to run the experiments.
Software Dependencies | No | The paper mentions "Xavier Initialization for Weight matrix, Zero initialization for bias vector" and "Adam Moment Optimization", which are methods rather than software packages. It does not list any software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions).
Experiment Setup | Yes | "Hyper-parameters such as the number of hidden layers, α and β are chosen using Random Search (Bergstra & Bengio (2012)). Details about the hyper-parameters are discussed in the Appendix." The Appendix reports Table 6 (the candidate hyper-parameter sets for each GANITE block) and Table 7 (the optimal hyper-parameters per dataset).
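The random-search procedure cited above (Bergstra & Bengio, 2012) samples configurations uniformly from candidate sets and keeps the best one on a validation metric. A minimal sketch follows; the search space values and the scoring callback are illustrative placeholders, not the actual candidate sets from the paper's Table 6:

```python
import random

# Illustrative candidate sets; the paper's actual sets are given in its Table 6.
search_space = {
    "hidden_layers": [1, 3, 5],
    "hidden_dim": [50, 100, 200],
    "alpha": [0.1, 0.5, 1.0, 2.0],
    "beta": [0.1, 0.5, 1.0, 2.0],
}

def random_search(score_fn, n_trials=20, seed=0):
    """Sample hyper-parameter configurations uniformly and return the best one.

    score_fn maps a configuration dict to a validation score (higher is better).
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in search_space.items()}
        s = score_fn(cfg)
        if s > best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score
```

In practice `score_fn` would train GANITE with the sampled configuration and return a validation metric such as negative PEHE on the held-out split.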