Counterfactual Temporal Point Processes

Authors: Kimia Noorbakhsh, Manuel Gomez Rodriguez

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Simulation experiments using synthetic and real epidemiological data show that the counterfactual realizations provided by our algorithm may give valuable insights to enhance targeted interventions."
Researcher Affiliation | Academia | Kimia Noorbakhsh (Sharif University of Technology, kimianoorbakhsh@gmail.com); Manuel Gomez Rodriguez (Max Planck Institute for Software Systems, manuelgr@mpi-sws.org)
Pseudocode | Yes | Algorithm 1 samples a counterfactual sequence of accepted events given a sequence of accepted and rejected events provided by Lewis' thinning algorithm (see the sketches after the table).
Open Source Code | Yes | "To facilitate research in this area, we release an open-source implementation of our algorithms and data at https://github.com/NetworksLearning/counterfactual-ttp."
Open Datasets | Yes | "... fitted using real event data from an Ebola outbreak in West Africa in 2013-2016 [52]."
Dataset Splits | No | The paper describes generating synthetic data or sampling realizations from fitted models and then sampling counterfactual realizations, but it does not specify explicit train/validation/test dataset splits for reproducibility.
Hardware Specification | Yes | "All experiments were performed on a machine with 48 Intel(R) Xeon(R) 3.00GHz CPU cores and 1.5TB of memory."
Software Dependencies | No | The paper does not explicitly list specific software dependencies with version numbers (e.g., Python, PyTorch, or CUDA versions) in the main text or the included checklist.
Experiment Setup | Yes | "In each experiment, we first sample 1,000 realizations from a process with one set of parameters using Algorithm 4 (or Algorithm 5). Then, we carry out the above-mentioned intervention and, for each of the sampled realizations, we use Algorithm 2 (or Algorithm 3) to sample 100 counterfactual realizations under the resulting alternative set of parameters. ... In all experiments, Algorithms 1-3 use 100 samples from the posterior distribution P^{C | X_i = x, Λ_i = λ(t_i); do(Λ_i = λ_m(t_i))}(U_i) of each Gumbel noise variable U_{i,x} to estimate the counterfactual thinning probabilities P^{C | X_i = x, Λ_i = λ(t_i); do(Λ_i = λ_m(t_i))}(X_i)."
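
For readers unfamiliar with the input that Algorithm 1 conditions on, the following is a minimal Python sketch of Lewis' thinning algorithm for a point process with bounded intensity, recording both accepted and rejected candidate events. The function name, the `intensity` callback, and the bound `lambda_max` are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def lewis_thinning(intensity, lambda_max, t_end, rng=None):
    """Sample one realization on [0, t_end] by thinning a homogeneous
    Poisson process with rate lambda_max.  Both accepted and rejected
    candidate times are returned, since the counterfactual algorithm
    conditions on the full thinning history.
    """
    rng = np.random.default_rng() if rng is None else rng
    accepted, rejected = [], []
    t = 0.0
    while True:
        # Candidate inter-arrival time from the dominating Poisson process.
        t += rng.exponential(1.0 / lambda_max)
        if t > t_end:
            break
        # Accept the candidate with probability lambda(t) / lambda_max.
        if rng.random() < intensity(t, accepted) / lambda_max:
            accepted.append(t)
        else:
            rejected.append(t)
    return np.array(accepted), np.array(rejected)
```

For example, `lewis_thinning(lambda t, hist: 1.0 + np.sin(t) ** 2, lambda_max=2.0, t_end=10.0)` samples a realization of an inhomogeneous Poisson process whose intensity never exceeds the bound of 2.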
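
The experiment-setup row refers to estimating counterfactual thinning probabilities from posterior samples of Gumbel noise. The sketch below illustrates that Monte Carlo step for a single binary accept/reject decision, assuming the standard Gumbel-Max posterior construction (the maximum score is assigned to the observed outcome and the other score is a truncated Gumbel); the function and its arguments are hypothetical and not the paper's implementation.

```python
import numpy as np

def counterfactual_accept_prob(x_obs, p_obs, p_cf, n_samples=100, rng=None):
    """Monte Carlo estimate of the counterfactual acceptance probability
    P(X = 1) under p_cf, given that X = x_obs was observed under p_obs,
    using posterior samples of the Gumbel noise in a Gumbel-Max SCM.
    Assumes 0 < p_obs < 1 and 0 < p_cf < 1.
    """
    rng = np.random.default_rng() if rng is None else rng
    log_p_obs = np.log([1.0 - p_obs, p_obs])  # observed (reject, accept) log-probabilities
    log_p_cf = np.log([1.0 - p_cf, p_cf])     # counterfactual log-probabilities
    hits = 0
    for _ in range(n_samples):
        # Posterior sample of the scores phi_j = log p_j + g_j given argmax_j phi_j = x_obs:
        # the maximum score is Gumbel(0) (probabilities are normalized) and belongs to x_obs,
        top = rng.gumbel(loc=0.0)
        phi = np.empty(2)
        phi[x_obs] = top
        # while the other score is Gumbel(log p_j) truncated to lie below the maximum.
        other = 1 - x_obs
        g = rng.gumbel(loc=log_p_obs[other])
        phi[other] = -np.log(np.exp(-g) + np.exp(-top))
        # Recover the noise and replay the argmax under the counterfactual probabilities.
        noise = phi - log_p_obs
        hits += int(np.argmax(log_p_cf + noise) == 1)
    return hits / n_samples
```

A quick sanity check of the construction is counterfactual stability: when `p_cf == p_obs`, every posterior sample reproduces the observed outcome, so `counterfactual_accept_prob(1, 0.3, 0.3)` returns 1.0.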