Neural Jump Stochastic Differential Equations

Authors: Junteng Jia, Austin R. Benson

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the predictive capabilities of our model on a range of synthetic and real-world marked point process datasets, including classical point processes (such as Hawkes processes), awards on Stack Overflow, medical records, and earthquake monitoring.
Researcher Affiliation | Academia | Junteng Jia, Cornell University, jj585@cornell.edu; Austin R. Benson, Cornell University, arb@cs.cornell.edu
Pseudocode | Yes | The complete algorithm for simulating the hybrid system with stochastic events is described in Appendix A.1.
Open Source Code | Yes | The complete implementation of our algorithms and experiments is available at https://github.com/000Justin000/torchdiffeq/tree/jj585.
Open Datasets | Yes | We use our model to predict the time and locations of earthquakes above level 4.0 in 2007-2018 using historical data from 1970-2006. Data from https://www.kaggle.com/danielpe/earthquakes
Dataset Splits | Yes | For each generative process, we create a dataset by simulating 500 event sequences within the time interval [0, 100] and use 60% for training, 20% for validation, and 20% for testing.
Hardware Specification | Yes | We train all of our models on a workstation with an 8-core i7-7700 CPU @ 3.60GHz processor and 32 GB memory.
Software Dependencies | No | The paper mentions using the Adam optimizer and links to an implementation built on torchdiffeq, but it does not provide version numbers for these or other software dependencies.
Experiment Setup | Yes | We use the Adam optimizer with β1 = 0.9, β2 = 0.999; the architectures, hyperparameters, and learning rates for different experiments are reported below. ... the learning rate for the Adam optimizer is set to 10^-3 with weight decay rate 10^-5.