Cumulants of Hawkes Processes are Robust to Observation Noise

Authors: William Trouleau, Jalal Etesami, Matthias Grossglauser, Negar Kiyavash, Patrick Thiran

ICML 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | To illustrate the result of Theorem 1 and to characterize the effect of random translations on the estimation of MHPs, we carry out two sets of experiments. First, we simulate a synthetic dataset from an MHP and quantify the ability of two maximum likelihood-based and two cumulant-based approaches to learn the ground-truth excitation matrix under varying levels of noise power. Second, we evaluate the stability of each approach to random translations on a real dataset of Bund Futures traded at Eurex.
Researcher Affiliation | Academia | (1) School of Computer and Communication Sciences, EPFL, Lausanne, Switzerland; (2) College of Management of Technology, EPFL, Lausanne, Switzerland.
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | Yes | In addition, the open-source code is publicly available on GitHub: https://github.com/trouleau/noisy-hawkes-cumulants
Open Datasets | Yes | We also evaluated the effect of random translations on a publicly available real-world dataset of Bund Futures traded at Eurex. The dataset is publicly available at: https://github.com/X-DataInitiative/tick-datasets/
Dataset Splits | No | The paper mentions simulating datasets and using a real-world dataset for evaluation but does not provide specific details on train/validation/test splits, such as percentages or sample counts, or refer to predefined splits.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper does not list specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9) required to replicate the experiment.
Experiment Setup | Yes | We considered a non-symmetric block matrix G, depicted in Figure 4(a), with exponential excitation functions G_{i,j}(t) = α_{i,j} β exp(−βt) for all i, j, with β = 1, and baseline intensities µ_i = 0.01 for all i. We simulated 20 datasets, each comprising 5 realizations of 10^5 events. We then randomly translated each dataset with noise distributions F_i = N(0, σ²), 1 ≤ i ≤ d, for varying noise powers σ², and estimated the excitation matrix with each of the aforementioned approaches. All reported values are averaged over the 20 simulated datasets (± standard error).
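
The quoted setup can be sketched in a few lines. The snippet below is an illustrative reconstruction rather than the authors' released code: it assumes the tick library's SimuHawkesExpKernels simulator, substitutes a placeholder diagonal adjacency matrix with dimension d = 10 for the block matrix G of Figure 4(a), and fixes one placeholder noise power σ²; only β = 1, µ_i = 0.01, the 10^5-event realization length, and the Gaussian translation model F_i = N(0, σ²) come from the setup quoted above.

import numpy as np
from tick.hawkes import SimuHawkesExpKernels

# Illustrative parameters; the actual block matrix G of Figure 4(a) is not reproduced here.
d = 10                          # network dimension (placeholder)
adjacency = 0.5 * np.eye(d)     # placeholder excitation weights alpha_{i,j}
decays = 1.0                    # beta = 1
baseline = 0.01 * np.ones(d)    # mu_i = 0.01 for all i
sigma2 = 0.1                    # one noise power sigma^2 from the sweep (placeholder value)

# Simulate one realization of 10^5 events from the multivariate Hawkes process.
hawkes = SimuHawkesExpKernels(adjacency=adjacency, decays=decays, baseline=baseline,
                              max_jumps=int(1e5), verbose=False, seed=0)
hawkes.simulate()

# Apply i.i.d. random translations: every event in dimension i is shifted by a draw
# from F_i = N(0, sigma^2); re-sorting restores chronological order within each dimension.
rng = np.random.default_rng(0)
noisy_timestamps = [np.sort(t + rng.normal(0.0, np.sqrt(sigma2), size=t.shape))
                    for t in hawkes.timestamps]

The maximum likelihood-based and cumulant-based estimators would then be fit on noisy_timestamps in place of the clean event times, and the recovered excitation matrix compared against the ground-truth G.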