Transfer Learning on Heterogeneous Feature Spaces for Treatment Effects Estimation
Authors: Ioana Bica, Mihaela van der Schaar
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | On a new semi-synthetic data simulation benchmark for heterogeneous transfer learning we not only demonstrate performance improvements of our heterogeneous transfer causal effect learners across datasets, but also provide insights into the differences between these learners from a transfer perspective. |
| Researcher Affiliation | Collaboration | Ioana Bica: University of Oxford, Oxford, UK; The Alan Turing Institute, London, UK (ioana.bica@eng.ox.ac.uk). Mihaela van der Schaar: University of Cambridge, Cambridge, UK; University of California, Los Angeles, USA; The Alan Turing Institute, London, UK (mv472@cam.ac.uk). |
| Pseudocode | Yes | See Appendix C for the pseudo-code for the HTCE-learners. |
| Open Source Code | Yes | The code for the HTCE-learners can be found at https://github.com/vanderschaarlab and at https://github.com/ioanabica/HTCE-learners. |
| Open Datasets | Yes | We perform experiments using patient features from three real datasets with diverse characteristics (e.g. number of features, proportion of categorical features): Twins [43], TCGA [44] and MAGGIC [45]. |
| Dataset Splits | Yes | We use an 80%/10%/10% split for training, validation and test sets. (An illustrative split sketch follows the table.) |
| Hardware Specification | Yes | All experiments were run on NVIDIA V100 GPUs. |
| Software Dependencies | No | The paper mentions the 'Adam optimizer [63]' but does not give a version number for it or for any other software component, information that reproducibility requires. |
| Experiment Setup | Yes | We use a learning rate of 0.001 with Adam optimizer [63] and train for 200 epochs, with early stopping based on the validation loss with a patience of 10. (An illustrative training sketch follows the table.) |
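
The 80%/10%/10% split reported above is mechanical to reproduce. Below is a minimal sketch using scikit-learn's `train_test_split`; the arrays `X`, `y`, and `w` are placeholder covariates, outcomes, and treatment indicators, not the paper's actual data-loading code.

```python
# Minimal sketch of an 80%/10%/10% train/validation/test split.
# X, y, w are placeholder arrays, not the paper's data pipeline.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 25))    # covariates
y = rng.normal(size=1000)          # outcomes
w = rng.integers(0, 2, size=1000)  # treatment assignments

# Carve off the 80% training set, then split the remaining 20% in half
# to obtain 10% validation and 10% test.
X_train, X_rest, y_train, y_rest, w_train, w_rest = train_test_split(
    X, y, w, test_size=0.2, random_state=0
)
X_val, X_test, y_val, y_test, w_val, w_test = train_test_split(
    X_rest, y_rest, w_rest, test_size=0.5, random_state=0
)
```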
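Likewise, the optimization settings in the Experiment Setup row map onto a standard PyTorch training loop. In the sketch below, only the optimizer (Adam), learning rate (0.001), epoch budget (200), and early-stopping patience (10) come from the paper; the two-layer model, MSE loss, and batch size are illustrative assumptions, not the HTCE-learner architecture.

```python
# Sketch of the reported training configuration: Adam, lr=0.001, 200 epochs,
# early stopping on validation loss with patience 10. The model, loss, and
# synthetic data are illustrative stand-ins for the HTCE-learners.
import copy
import torch
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)
train_set = TensorDataset(torch.randn(800, 25), torch.randn(800, 1))
val_set = TensorDataset(torch.randn(100, 25), torch.randn(100, 1))
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
val_loader = DataLoader(val_set, batch_size=64)

model = torch.nn.Sequential(
    torch.nn.Linear(25, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # lr from the paper
loss_fn = torch.nn.MSELoss()

best_val, bad_epochs, patience = float("inf"), 0, 10
best_state = copy.deepcopy(model.state_dict())
for epoch in range(200):  # epoch budget from the paper
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss_fn(model(xb), yb).backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = sum(loss_fn(model(xb), yb).item() for xb, yb in val_loader)

    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
        best_state = copy.deepcopy(model.state_dict())
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break  # validation loss has not improved for 10 epochs

model.load_state_dict(best_state)  # restore the best checkpoint
```

Restoring `best_state` at the end reflects the common practice of evaluating the checkpoint with the lowest validation loss; the paper does not state whether it does this.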