Sharp Bounds for Generalized Causal Sensitivity Analysis
Authors: Dennis Frauen, Valentyn Melnychuk, Stefan Feuerriegel
NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we propose a scalable algorithm to estimate our sharp bounds from observational data and perform extensive computational experiments to show the validity of our bounds empirically. |
| Researcher Affiliation | Academia | Dennis Frauen, Valentyn Melnychuk & Stefan Feuerriegel; LMU Munich; Munich Center for Machine Learning; {frauen,melnychuk,feuerriegel}@lmu.de |
| Pseudocode | Yes | Algorithm 1: Causal sensitivity analysis with mediators |
| Open Source Code | Yes | Code is available at https://github.com/DennisFrauen/SharpCausalSensitivity. |
| Open Datasets | Yes | The preprocessed data is publicly available at https://github.com/jopersson/covid19-mobility/blob/main/Data. |
| Dataset Splits | No | We perform hyperparameter tuning for our experiments on synthetic data using grid search on a validation set. ... The paper states that a validation set was used for hyperparameter tuning but does not provide specific details on the split percentage or methodology. |
| Hardware Specification | No | The paper does not specify any hardware components (e.g., CPU, GPU models, memory) used for the experiments. |
| Software Dependencies | No | We use feed-forward neural networks with softmax activation function... For densities, we use conditional normalizing flows [73] (neural spline flows [21])... We perform training using the Adam optimizer [41]. ... The paper mentions various software components and libraries (e.g., neural networks, normalizing flows, Adam optimizer) but does not provide specific version numbers for them (a minimal model-stack sketch follows this table). |
| Experiment Setup | Yes | We perform hyperparameter tuning for our experiments on synthetic data using grid search on a validation set. The tunable parameters and search ranges are shown in Table 2. ... Table 2 (columns: Model, Tunable parameters, Search range) lists Epochs, Batch size, Learning rate, Hidden layer size, Number of spline bins, and Dropout probability (a hedged grid-search sketch follows this table). |
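
The Software Dependencies row quotes the paper's model components (feed-forward networks with a softmax output, conditional normalizing flows as density estimators, and the Adam optimizer) without versions or code details. Below is a minimal sketch, assuming PyTorch, of only the softmax feed-forward part trained with Adam; layer sizes, the toy data, and the training loop are hypothetical illustrations, and the normalizing-flow density estimator is omitted because its exact configuration is not reproduced here. This is not the authors' implementation (their code is linked above).

```python
# Hedged sketch: a softmax feed-forward classifier trained with Adam,
# mirroring the components quoted from the paper (not the authors' code).
import torch
import torch.nn as nn


class SoftmaxMLP(nn.Module):
    def __init__(self, d_in: int, d_hidden: int, n_classes: int, p_drop: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(d_hidden, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax over class logits, as in the quoted description.
        return torch.softmax(self.net(x), dim=-1)


def train(model, x, y, epochs: int = 20, lr: float = 1e-3, batch_size: int = 64):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.NLLLoss()  # expects log-probabilities
    for _ in range(epochs):
        perm = torch.randperm(x.shape[0])
        for i in range(0, x.shape[0], batch_size):
            idx = perm[i:i + batch_size]
            probs = model(x[idx])
            loss = loss_fn(torch.log(probs + 1e-12), y[idx])
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


if __name__ == "__main__":
    # Hypothetical toy data standing in for the paper's synthetic setup.
    x = torch.randn(512, 5)
    y = (x[:, 0] > 0).long()
    model = train(SoftmaxMLP(d_in=5, d_hidden=32, n_classes=2), x, y)
```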
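
The Experiment Setup row states that hyperparameters were tuned by grid search on a validation set over the parameters named in Table 2, but this report does not reproduce the concrete search ranges. The sketch below shows the general pattern under placeholder ranges; `train_and_evaluate` is a hypothetical user-supplied function that trains one configuration and returns its validation loss.

```python
# Hedged sketch: exhaustive grid search over the Table 2 hyperparameters,
# scored on a held-out validation set. Ranges are placeholders, not the
# paper's; `train_and_evaluate` is a hypothetical caller-supplied function.
from itertools import product

SEARCH_SPACE = {
    "epochs": [50, 100],
    "batch_size": [32, 64],
    "learning_rate": [1e-3, 1e-4],
    "hidden_layer_size": [32, 64],
    "num_spline_bins": [4, 8],
    "dropout_probability": [0.0, 0.1],
}


def grid_search(train_data, val_data, train_and_evaluate):
    """Return the configuration with the lowest validation loss."""
    best_config, best_loss = None, float("inf")
    keys = list(SEARCH_SPACE)
    for values in product(*(SEARCH_SPACE[k] for k in keys)):
        config = dict(zip(keys, values))
        val_loss = train_and_evaluate(config, train_data, val_data)
        if val_loss < best_loss:
            best_config, best_loss = config, val_loss
    return best_config, best_loss
```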