Causal Structure Learning for Latent Intervened Non-stationary Data

Authors: Chenxi Liu, Kun Kuang

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on both synthetic and real-world datasets demonstrate that our method outperforms the baselines on causal structure learning for latent intervened non-stationary data.
Researcher Affiliation | Academia | College of Computer Science and Technology, Zhejiang University, Zhejiang, China. Correspondence to: Kun Kuang <kunkuang@zju.edu.cn>.
Pseudocode | Yes | Algorithm 1: Latent Intervention Learning
Open Source Code | Yes | Our code is available at LIN2023 on GitHub.
Open Datasets | Yes | The dataset is provided by Copernicus Climate Change Service information (Hersbach et al., 2023).
Dataset Splits | No | The paper mentions 'hold-out data' for hyper-parameter selection and a 'test set' for evaluation, but it does not give the specific percentages or sample counts for the training, validation, and test splits needed for reproduction.
Hardware Specification | No | The paper does not specify the hardware (e.g., CPU or GPU models, memory) used to run the experiments.
Software Dependencies | No | The paper names several software packages for its baselines ('causal-learn', 'lingam', 'tigramite', 'dynotears', and 'CPF SAEM') and uses neural networks in its own method, but it does not specify version numbers for these or any other software dependencies.
Experiment Setup | Yes | Table 8. Hyper-parameter setting