Scalable Quasi-Bayesian Inference for Instrumental Variable Regression

Authors: Ziyu Wang, Yuhao Zhou, Tongzheng Ren, Jun Zhu

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | 'We analyze the theoretical properties of the proposed quasi-posterior, and demonstrate through empirical evaluation the competitive performance of our method. [...] 6 Experiments'
Researcher Affiliation | Collaboration | 1 Dept. of Comp. Sci. and Tech., BNRist Center, State Key Lab for Intell. Tech. & Sys., Institute for AI, Tsinghua-Bosch Joint Center for ML, Tsinghua University; 2 Department of Computer Science, UT Austin
Pseudocode | No | The paper describes the algorithm steps in text (e.g., 'The algorithm has the form of stochastic gradient descent-ascent') but does not provide pseudocode or a clearly labeled algorithm block in the main text.
Open Source Code | Yes | Code to reproduce the experiments is available at https://github.com/meta-inf/qbdiv.
Open Datasets | Yes | 'We now turn to the more challenging demand simulation first proposed by [4].'
Dataset Splits | No | The paper mentions 'cross validation' for hyperparameter selection but does not specify explicit percentages, sample counts, or pre-defined training/validation/test splits in the main text.
Hardware Specification | No | The paper mentions running experiments on 'CPU' and 'GPU' (Table 1) and 'a single accelerator' (Table 1 caption), but provides no specific models, brands, or detailed specifications for the hardware used.
Software Dependencies | No | The paper does not provide specific software dependencies with version numbers.
Experiment Setup | No | The paper states 'Hyperparameter for the kernelized IV methods are selected by cross validation...see Appendix D.1. For kernels we choose the RBF and Matérn kernels...See Appendix D.3 for the detailed setup.' but does not list concrete hyperparameter values or training configurations in the main text.
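The Pseudocode row quotes the paper's only algorithmic description: 'The algorithm has the form of stochastic gradient descent-ascent'. For readers unfamiliar with that template, the following is a minimal illustrative sketch of stochastic gradient descent-ascent on a toy saddle-point objective. The objective, step size, and noise model here are assumptions for illustration only; this is not the paper's quasi-Bayesian IV objective or implementation.

```python
import numpy as np

# Illustrative stochastic gradient descent-ascent (SGDA) on the toy
# saddle-point problem  min_x max_y  f(x, y) = x*y + 0.5*x^2 - 0.5*y^2,
# whose unique saddle point is (0, 0). The objective and hyperparameters
# are assumptions chosen for demonstration, not taken from the paper.

def grad_x(x, y):
    return y + x      # df/dx

def grad_y(x, y):
    return x - y      # df/dy

rng = np.random.default_rng(0)
x, y = 2.0, -1.5      # arbitrary starting point
lr = 0.05             # shared step size for both players

for _ in range(2000):
    nx, ny = rng.normal(scale=0.01, size=2)   # noise mimics minibatch gradients
    x = x - lr * (grad_x(x, y) + nx)          # descent step on the min player
    y = y + lr * (grad_y(x, y) + ny)          # ascent step on the max player

# (x, y) should now lie near the saddle point (0, 0).
```

The key structural point is the opposite signs of the two updates: the minimizing variable steps down its stochastic gradient while the maximizing variable steps up, in alternation.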
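The Experiment Setup row notes that the paper selects RBF and Matérn kernels but defers concrete values to the appendices. For reference, these are standard kernel families; a plain-numpy sketch is below. The lengthscale and the Matérn smoothness (ν = 3/2) are illustrative assumptions, not the paper's cross-validated choices.

```python
import numpy as np

# Sketches of the two kernel families the paper mentions (RBF and Matérn).
# Lengthscale and smoothness are illustrative assumptions, not the
# cross-validated values from the paper's appendices.

def rbf_kernel(x1, x2, lengthscale=1.0):
    # k(a, b) = exp(-||a - b||^2 / (2 * lengthscale^2))
    d2 = np.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def matern32_kernel(x1, x2, lengthscale=1.0):
    # Matérn kernel with nu = 3/2: k(a, b) = (1 + s) * exp(-s),
    # where s = sqrt(3) * ||a - b|| / lengthscale.
    d = np.sqrt(np.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1))
    s = np.sqrt(3.0) * d / lengthscale
    return (1.0 + s) * np.exp(-s)

X = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
K = rbf_kernel(X, X)
M = matern32_kernel(X, X)
print(K.shape)  # (5, 5) Gram matrix; diagonal entries equal 1 since k(x, x) = 1
```

Both kernels yield symmetric positive semi-definite Gram matrices with unit diagonal, which is what downstream kernelized IV solvers consume.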