Nonlinear Causal Discovery with Latent Confounders

Authors: David Kaltenpoth, Jilles Vreeken

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirically we show that it outperforms other state-of-the-art methods for causal discovery under latent confounding on synthetic and real-world data."
Researcher Affiliation | Academia | "CISPA Helmholtz Center for Information Security, Germany."
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | Yes | "All code and results can be found on the authors' website: https://eda.rg.cispa.io/prj/fanta/" "We make all code and results available in the supplement."
Open Datasets | Yes | "We evaluate it on the REGED dataset (Guyon et al., 2008)... Sachs dataset (Sachs et al., 2005)."
Dataset Splits | No | The paper describes the data generation process for synthetic data and mentions sample sizes, but does not specify explicit training, validation, or test splits (e.g., percentages or counts).
Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments.
Software Dependencies | No | "We implemented NOCADILAC using Tensorflow (Abadi et al., 2016) and perform optimization using Adam (Kingma & Ba, 2014)." The paper names the software but does not provide version numbers for reproducibility.
Experiment Setup | No | The paper mentions using Adam for optimization but does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or other detailed training configurations. (See the sketch below.)
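
The paper reports only that NOCADILAC was implemented in TensorFlow and optimized with Adam, with no versions or hyperparameters. The following is a minimal, hypothetical sketch of the kind of configuration a reproduction would have to pin down and report; the placeholder data, toy model, learning rate, batch size, and epoch count are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical reproduction stub: none of these values come from the paper.
# The authors state only "TensorFlow" and "Adam"; the versions, architecture,
# and hyperparameters below are assumptions that a reproducible write-up
# would need to specify explicitly.
import numpy as np
import tensorflow as tf

print(tf.__version__)  # pin and report the exact TensorFlow version

# Placeholder data standing in for a synthetic benchmark (assumed shapes).
rng = np.random.default_rng(0)  # fixed seed for reproducibility
X = rng.normal(size=(1000, 10)).astype("float32")
y = rng.normal(size=(1000, 1)).astype("float32")

# Toy network; the actual NOCADILAC architecture is not reproduced here.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Adam with illustrative (assumed) hyperparameters -- the paper gives none.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
model.compile(optimizer=optimizer, loss="mse")

# Batch size and epoch count are likewise assumptions for this sketch.
model.fit(X, y, batch_size=64, epochs=10, verbose=0)
```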