Deep Generalized Method of Moments for Instrumental Variable Analysis

Authors: Andrew Bennett, Nathan Kallus, Tobias Schnabel

NeurIPS 2019

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "Numerical results show our algorithm matches the performance of the best tuned methods in standard settings and continues to work in high-dimensional settings where even recent methods break." and "In this section, we compare Deep GMM against a wide set of baselines for IV estimation." |
| Researcher Affiliation | Collaboration | Andrew Bennett (Cornell University, awb222@cornell.edu); Nathan Kallus (Cornell University, kallus@cornell.edu); Tobias Schnabel (Microsoft Research, tbs49@cornell.edu) |
| Pseudocode | No | The paper describes the algorithm and optimization process in prose but does not include a formal pseudocode block or algorithm listing; an illustrative sketch of the training loop is given below the table. |
| Open Source Code | Yes | "Our implementation of Deep GMM is publicly available at https://github.com/CausalML/DeepGMM." |
| Open Datasets | Yes | "We now move on to scenarios based on the MNIST dataset [26] in order to test our method's ability to deal with structured, high-dimensional X and Z variables." (see the MNIST sketch below the table) |
| Dataset Splits | Yes | "We sample n = 2000 points for train, validation, and test sets each." |
| Hardware Specification | No | The paper does not provide hardware details such as GPU/CPU models, memory, or cloud computing specifications used for running the experiments. |
| Software Dependencies | No | The paper mentions implementing Deep GMM in PyTorch but does not give version numbers for PyTorch or any other software dependency. |
| Experiment Setup | Yes | "Hyperparameters used for our method in these scenarios are described in Appendix B.2." and "The only parameters of our algorithm are the neural network architectures for F and G and the optimization algorithm parameters (e.g., learning rate)." |
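
Since the paper describes its training procedure only in prose, the following is a minimal, hypothetical sketch of the kind of adversarial moment-matching loop it outlines: a response network g is trained against a critic network f in a simplified zero-sum game. The synthetic data, network sizes, learning rate, and the use of plain Adam (the paper reports optimizing with OAdam) are all illustrative assumptions rather than the authors' actual configuration; the sample size n = 2000 simply matches the paper's per-split sample size.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical synthetic IV data: instrument Z, endogenous X, outcome Y.
# The shared confounder enters both X and Y, which is what makes plain
# regression biased and an instrument necessary.
n = 2000
z = torch.randn(n, 1)
confounder = torch.randn(n, 1)
x = z + confounder
y = x.abs() + confounder + 0.1 * torch.randn(n, 1)  # true response: |x|

# Small MLPs standing in for the response model g in G and the critic f in F;
# the architectures actually used are described in the paper's Appendix B.2.
g = nn.Sequential(nn.Linear(1, 20), nn.LeakyReLU(), nn.Linear(20, 1))
f = nn.Sequential(nn.Linear(1, 20), nn.LeakyReLU(), nn.Linear(20, 1))

# The paper optimizes its smooth zero-sum game with OAdam (optimistic Adam);
# plain Adam is a simplifying stand-in here.
opt_g = torch.optim.Adam(g.parameters(), lr=5e-4)
opt_f = torch.optim.Adam(f.parameters(), lr=5e-4)

for step in range(2000):
    # Ascent step on the critic: maximize a regularized moment violation,
    # roughly E[f(Z)(Y - g(X))] - (1/4) E[f(Z)^2].
    fz = f(z)
    critic_obj = (fz * (y - g(x).detach())).mean() - 0.25 * (fz ** 2).mean()
    opt_f.zero_grad()
    (-critic_obj).backward()
    opt_f.step()

    # Descent step on the response model against the (now fixed) critic.
    g_obj = (f(z).detach() * (y - g(x))).mean()
    opt_g.zero_grad()
    g_obj.backward()
    opt_g.step()
```

The quadratic penalty on f(Z) above is a simplified stand-in for the paper's variance-based regularization, which reweights that term using residuals from a previous iterate of g.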
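
For the MNIST scenarios, the paper replaces low-dimensional treatment and/or instrument variables with digit images to create structured, high-dimensional inputs. The helper below is a hypothetical reconstruction of that preprocessing step using torchvision; the clip-and-round mapping from scalars to digit classes, and all names here, are illustrative assumptions rather than the paper's exact transformation.

```python
import torch
from torchvision import datasets

# Download MNIST (dataset [26] in the paper's references) via torchvision.
mnist = datasets.MNIST("data", train=True, download=True)

# Pre-index image positions by digit label for fast random sampling.
by_digit = {d: (mnist.targets == d).nonzero(as_tuple=True)[0] for d in range(10)}

def to_digit_images(v: torch.Tensor) -> torch.Tensor:
    """Replace each scalar in v with a random MNIST image of a matching digit.

    The clip-and-round mapping below is an assumption for illustration; the
    paper defines its own transformation from scalar values to digit classes.
    """
    digits = torch.clamp(torch.round(1.5 * v.flatten() + 5), 0, 9).long()
    idx = torch.stack([
        by_digit[int(d)][torch.randint(0, len(by_digit[int(d)]), ())]
        for d in digits
    ])
    return mnist.data[idx].float() / 255.0  # shape (len(v), 28, 28)

# Example: lift a scalar instrument Z into image space.
z_images = to_digit_images(torch.randn(2000, 1))
```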