Deep Gaussian Markov Random Fields for Graph-Structured Dynamical Systems

Authors: Fiona Lippert, Bart Kranstauber, Emiel van Loon, Patrick Forré

NeurIPS 2023

Reproducibility Variable | Result | LLM Response

Research Type | Experimental
"In experiments on synthetic and real world data, we demonstrate that our approach provides accurate state and uncertainty estimates, comparing favorably to other scalable approaches relying on ensembles or simplifications of the dependency structure."

Researcher Affiliation | Academia
Fiona Lippert, University of Amsterdam, f.lippert@uva.nl; Bart Kranstauber, University of Amsterdam, b.kranstauber@uva.nl; E. Emiel van Loon, University of Amsterdam, e.e.vanloon@uva.nl; Patrick Forré, University of Amsterdam, p.d.forre@uva.nl

Pseudocode | No
The paper describes the model and methods using mathematical equations and textual explanations, but it does not include any explicitly labeled pseudocode or algorithm blocks.

Open Source Code | No
The paper does not provide an explicit statement or a link to its own open-source code for the described methodology. It mentions using "existing software for graph neural networks" but does not state that its own implementation is released.

Open Datasets | Yes
"To test our method on a real world system exhibiting more complex dynamics and graph structures, we conduct experiments on an air quality dataset obtained from [59]. The dataset contains hourly PM2.5 measurements from 246 sensors distributed around Beijing, China, covering a time period of K = 400 hours."

Dataset Splits | Yes
"For all experiments, we use the masked pixels as test set, and 10% of the observed pixels as validation set for hyperparameter tuning."

Hardware Specification | Yes
"We implemented ST-DGMRF in Pytorch and Pytorch Geometric, and conducted experiments on a consumer-grade GPU... Most computations were performed on a Nvidia Titan X GPU."

Software Dependencies | No
The paper mentions "Pytorch and Pytorch Geometric" and the "Python statsmodels package" but does not specify version numbers for these components, which a reproducible dependency description requires.

Experiment Setup | Yes
"In all experiments, we optimize parameters for 10 000 iterations using Adam [32] with learning rate 0.01, and draw 100 posterior samples to estimate marginal variances. Unless specified otherwise, we use Lspatial = 2, Ltemporal = 4 and p = 1, and define the variational distribution based on one spatial layer and one temporal diffusion layer. ... The observation noise is assumed to be uniform with σ = 0.01."
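The experiment setup quoted above can be sketched in code. This is a minimal, hedged illustration only: a generic Adam optimizer (Kingma & Ba) implemented in pure Python, applied to a hypothetical quadratic stand-in for the ST-DGMRF objective (the `target` vector and `grad` function are invented for the example, not from the paper). It reuses the reported settings where they apply: learning rate 0.01, 10 000 iterations, 100 posterior samples, and assumed observation noise σ = 0.01.

```python
import math
import random

def adam(grad_fn, x, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8, n_iter=10_000):
    """Generic Adam optimizer; lr=0.01 and 10 000 iterations as reported."""
    m = [0.0] * len(x)  # first-moment estimates
    v = [0.0] * len(x)  # second-moment estimates
    for t in range(1, n_iter + 1):
        g = grad_fn(x)
        for i in range(len(x)):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            m_hat = m[i] / (1 - beta1 ** t)  # bias correction
            v_hat = v[i] / (1 - beta2 ** t)
            x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Hypothetical quadratic objective standing in for the actual loss.
target = [1.0, -2.0, 0.5]
grad = lambda x: [2.0 * (xi - ti) for xi, ti in zip(x, target)]
mean = adam(grad, [0.0, 0.0, 0.0])

# Draw 100 posterior samples (the paper's count) around the optimized mean,
# using the assumed uniform observation noise sigma = 0.01, and estimate
# marginal variances from the samples.
random.seed(0)
sigma = 0.01
samples = [[mi + random.gauss(0.0, sigma) for mi in mean] for _ in range(100)]
marginal_var = [
    sum((s[i] - mean[i]) ** 2 for s in samples) / len(samples)
    for i in range(len(mean))
]
```

In the paper itself the optimized quantities are the ST-DGMRF layer parameters and the samples come from the variational posterior; the sketch only mirrors the optimizer settings and the sample-based variance estimation step.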