Exploiting Independent Instruments: Identification and Distribution Generalization
Authors: Sorawit Saengkyongam, Leonard Henckel, Niklas Pfister, Jonas Peters
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both simulated and real-world data in Section 5 confirm that HSIC-X can exploit the improved identifiability guarantees and can be more efficient in finite samples if wrong solutions yield both second- and higher-order dependencies between the residuals and the instruments. |
| Researcher Affiliation | Academia | Sorawit Saengkyongam, Leonard Henckel, Niklas Pfister, Jonas Peters (Department of Mathematical Sciences, University of Copenhagen, Denmark). Correspondence to: Sorawit Saengkyongam <ss@math.ku.dk>. |
| Pseudocode | Yes | Algorithm 1 (HSIC-X) and Algorithm 2 (HSIC-X-pen); a hedged sketch of both ideas follows the table. |
| Open Source Code | Yes | The code for all the experiments is available at https://github.com/sorawitj/HSIC-X. |
| Open Datasets | Yes | We apply HSIC-X to estimate the causal effect of education on earnings using 3010 observations from the 1979 National Longitudinal Survey of Young Men (Card, 1995). |
| Dataset Splits | No | The paper mentions generating observations and using a 'test sample of size 10000', but it does not explicitly state details about training or validation splits, such as percentages or sample counts for a validation set. |
| Hardware Specification | No | The paper mentions using Adam as an optimizer and neural networks in its experiments, but it does not provide any specific details about the hardware used, such as GPU models, CPU types, or memory specifications. |
| Software Dependencies | No | The paper states: 'In all the experiments, we used Adam' and 'We optimize our model using Adam optimizer...and use the R package AnchorRegression (https://github.com/simzim96/AnchorRegression)'. While specific optimizers and a package are mentioned, no version numbers are provided for these software components. |
| Experiment Setup | Yes | In all the experiments, we use Adam as the optimizer with the learning rate set to 0.01 and a batch size of 256; the sketch below mirrors these optimizer settings. |
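
The rows above describe the method only in outline. As a concrete illustration, here is a minimal, hypothetical sketch of the core HSIC-X idea: estimate a linear causal parameter by minimizing an empirical HSIC between the residuals and the instrument, optimized with Adam at the reported learning rate of 0.01. The Gaussian-kernel bandwidth, the toy data-generating process, and the training loop are illustrative assumptions, not the authors' implementation (see https://github.com/sorawitj/HSIC-X for that).

```python
# Hedged sketch of the HSIC-X idea: drive the HSIC dependence between the
# residuals Y - X*beta and the instrument Z to zero. All hyperparameters
# and the toy data below are assumptions for illustration.
import torch

def gaussian_gram(x, sigma=1.0):
    """Gaussian-kernel Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = torch.cdist(x, x) ** 2
    return torch.exp(-d2 / (2 * sigma ** 2))

def hsic(a, b, sigma=1.0):
    """Biased empirical HSIC estimate between samples a and b (n x d tensors)."""
    n = a.shape[0]
    K, L = gaussian_gram(a, sigma), gaussian_gram(b, sigma)
    H = torch.eye(n) - torch.ones(n, n) / n  # centering matrix
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy data: Z instruments X; Hc is a hidden confounder of X and Y.
torch.manual_seed(0)
n = 256                                      # matches the reported batch size
Z = torch.randn(n, 1)
Hc = torch.randn(n, 1)
X = Z + Hc + 0.1 * torch.randn(n, 1)
Y = 2.0 * X + Hc + 0.1 * torch.randn(n, 1)   # true causal effect: 2.0

beta = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([beta], lr=0.01)      # lr matches the reported setup
for _ in range(2000):
    opt.zero_grad()
    resid = Y - X * beta
    loss = hsic(resid, Z)                    # independence loss, not MSE
    loss.backward()
    opt.step()
print(f"estimated beta: {beta.item():.3f}")  # approaches 2.0 at independence
```

The residual Y - beta*X equals (2 - beta)*X plus confounding noise, so it is independent of Z exactly when beta hits the true causal coefficient 2.0, which is what the HSIC objective targets.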
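The table also lists a penalized variant, HSIC-X-pen. The additive loss below (MSE plus a weighted HSIC term, with an arbitrary weight `lam`) is an assumption about its structure in the spirit of the paper, not the authors' code; it reuses the data and `hsic` helper from the sketch above.

```python
# Hedged sketch of a penalized variant: trade off predictive fit against
# residual-instrument dependence. The weight lam and loss form are assumptions.
lam = 10.0
beta_pen = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([beta_pen], lr=0.01)
for _ in range(2000):
    opt.zero_grad()
    resid = Y - X * beta_pen
    loss = (resid ** 2).mean() + lam * hsic(resid, Z)  # fit + independence penalty
    loss.backward()
    opt.step()
```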