Neural Jump-Diffusion Temporal Point Processes
Authors: Shuai Zhang, Chuan Zhou, Yang Aron Liu, Peng Zhang, Xixun Lin, Zhi-Ming Ma
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both synthetic and real-world datasets demonstrate that NJDTPP is capable of capturing the dynamics of intensity processes in different scenarios and significantly outperforms the state-of-the-art TPP models in prediction tasks. |
| Researcher Affiliation | Academia | Academy of Mathematics and Systems Science, Chinese Academy of Sciences; School of Cyber Security, University of Chinese Academy of Sciences; Cyberspace Institute of Advanced Technology, Guangzhou University; Institute of Information Engineering, Chinese Academy of Sciences. |
| Pseudocode | Yes | The pseudo-code for the training algorithms of the Neural Jump-Diffusion Univariate Point Process (NJDUPP) and the Neural Jump-Diffusion Multivariate Point Process (NJDMPP) is presented in Algorithm 1 and Algorithm 2, respectively. |
| Open Source Code | Yes | Our code is available at https://github.com/Zh-Shuai/NJDTPP. |
| Open Datasets | Yes | The MIMIC-II dataset is available at a public GitHub repository, and all other datasets are available in the public EasyTPP library (Xue et al., 2024), an open benchmark for evaluating TPPs. See Appendix C.2 for dataset details. |
| Dataset Splits | Yes | The train-validation-test data split is 3 : 1 : 1. |
| Hardware Specification | Yes | The experiments are conducted on a Linux server with eight GPUs (NVIDIA RTX 2080 Ti * 8). |
| Software Dependencies | No | The paper mentions using PyTorch and the Adam optimizer but does not specify their version numbers, which are required for reproducibility. |
| Experiment Setup | Yes | Grid search is used to determine the remaining hyper-parameters: the learning rate is selected from {0.001, 0.01, 0.1}, the number of hidden layers from {1, 2, 3}, and the hidden layer size from {16, 32, 64}. |
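The Experiment Setup and Dataset Splits rows above can be sketched in code. This is a minimal illustration, not the authors' implementation (which lives in the linked GitHub repository): `evaluate` is a hypothetical placeholder for training NJDTPP on a configuration and returning a validation loss, and the split helper assumes a simple deterministic partition of sequence indices in the reported 3 : 1 : 1 ratio.

```python
from itertools import product

# Hyper-parameter grids reported in the Experiment Setup row.
LEARNING_RATES = [0.001, 0.01, 0.1]
HIDDEN_LAYER_NUMS = [1, 2, 3]
HIDDEN_LAYER_SIZES = [16, 32, 64]

def grid_search(evaluate):
    """Score every (lr, n_layers, hidden_size) combination with the
    user-supplied `evaluate` callable (a stand-in for training the model
    and returning a validation loss) and return the best configuration."""
    best_cfg, best_score = None, float("inf")
    for lr, n_layers, size in product(
        LEARNING_RATES, HIDDEN_LAYER_NUMS, HIDDEN_LAYER_SIZES
    ):
        score = evaluate(lr, n_layers, size)
        if score < best_score:
            best_cfg, best_score = (lr, n_layers, size), score
    return best_cfg, best_score

def split_3_1_1(n_sequences):
    """Deterministic 3:1:1 train/validation/test split over sequence
    indices, matching the ratio stated in the Dataset Splits row."""
    idx = list(range(n_sequences))
    n_train = 3 * n_sequences // 5
    n_val = n_sequences // 5
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]
```

The full grid contains 3 × 3 × 3 = 27 configurations, so an exhaustive search is cheap relative to model training.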