Probabilistic Weather Forecasting with Hierarchical Graph Neural Networks

Authors: Joel Oskarsson, Tomas Landelius, Marc Deisenroth, Fredrik Lindsten

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We experiment with the model on both global and limited area forecasting.
Researcher Affiliation | Collaboration | Joel Oskarsson (Linköping University, joel.oskarsson@liu.se); Tomas Landelius (Swedish Meteorological and Hydrological Institute, tomas.landelius@smhi.se); Marc Peter Deisenroth (University College London, m.deisenroth@ucl.ac.uk); Fredrik Lindsten (Linköping University, fredrik.lindsten@liu.se)
Pseudocode | Yes | Algorithm 1: Single-step prediction f for graph-based MLWP. (See Sketch 1 below.)
Open Source Code | Yes | Our code is available at https://github.com/mllam/neural-lam/tree/prob_model_global (global forecasting) and https://github.com/mllam/neural-lam/tree/prob_model_lam (LAM).
Open Datasets | Yes | The dataset used for training and evaluation is a 1.5° version of the global ERA5 reanalysis [17], provided through the WeatherBench 2 benchmark [40].
Dataset Splits | Yes | We use the years 1959–2017 for training, 2018–2019 for validation and 2020 as a test set. (See Sketch 2 below.)
Hardware Specification | Yes | The models are implemented in PyTorch and trained on 8 A100 80 GB GPUs in a data-parallel configuration. (See Sketch 3 below.)
Software Dependencies | No | The paper mentions 'PyTorch' but does not specify a version number or other software dependencies with their versions.
Experiment Setup | Yes | We train all models using the AdamW optimizer [57] and utilize BFloat16 mixed precision to save GPU memory. The training costs for the models make extensive hyperparameter tuning infeasible. We choose hyperparameters based on initial experimentation with smaller models. For Graph-EFM, the important weightings λ_KL and λ_CRPS in the loss L can be chosen by monitoring the model behavior during training. The full training schedule is given in table 7 for the deterministic models and in table 8 for the probabilistic models. (See Sketch 4 below.)
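
Sketch 1 (Pseudocode): A minimal PyTorch sketch of what the single-step prediction f in Algorithm 1 could look like, assuming a GraphCast-style encode-process-decode model over a (hierarchical) mesh graph. The module names, the stacking of the two previous states with the forcing inputs, and the residual update are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn as nn

    class SingleStepPredictor(nn.Module):
        """One prediction step: previous states + forcing -> next state."""

        def __init__(self, encoder: nn.Module, processor: nn.Module, decoder: nn.Module):
            super().__init__()
            self.encoder = encoder      # embeds grid inputs onto the mesh graph (assumed)
            self.processor = processor  # message passing over the hierarchical mesh (assumed)
            self.decoder = decoder      # maps mesh representation to a grid residual (assumed)

        def forward(self, x_prev, x_prev2, forcing):
            # Condition on the two most recent states and the known forcing inputs
            inputs = torch.cat([x_prev, x_prev2, forcing], dim=-1)
            mesh_rep = self.processor(self.encoder(inputs))
            # Residual update: predict the change from the latest state
            return x_prev + self.decoder(mesh_rep)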
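
Sketch 2 (Dataset Splits): A minimal sketch of the chronological 1959–2017 / 2018–2019 / 2020 split, assuming the ERA5 data is opened as an xarray Dataset with a time coordinate, as WeatherBench 2 distributes it via Zarr. The store path is a placeholder, not a real URL.

    import xarray as xr

    # Placeholder path to a WeatherBench 2 ERA5 Zarr store
    ds = xr.open_zarr("gs://weatherbench2/datasets/era5/<store>.zarr")

    train = ds.sel(time=slice("1959-01-01", "2017-12-31"))  # training years
    val = ds.sel(time=slice("2018-01-01", "2019-12-31"))    # validation years
    test = ds.sel(time=slice("2020-01-01", "2020-12-31"))   # test year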
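
Sketch 3 (Hardware Specification): A minimal sketch of an 8-GPU data-parallel configuration using torch.nn.parallel.DistributedDataParallel, launched with torchrun --nproc_per_node=8. The model is the placeholder class from Sketch 1; the released code may use a different training framework.

    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    dist.init_process_group(backend="nccl")  # one process per GPU under torchrun
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # encoder, processor, decoder: placeholder modules as in Sketch 1
    model = SingleStepPredictor(encoder, processor, decoder).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])  # gradients synchronized across GPUs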
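
Sketch 4 (Experiment Setup): A minimal sketch of a training step with AdamW and BFloat16 mixed precision, and a Graph-EFM-style loss combining KL and CRPS terms weighted by λ_KL and λ_CRPS. The learning rate, the weighting values, and the kl_term / crps_term / train_loader names are illustrative placeholders, not the paper's settings.

    import torch

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)  # lr is illustrative
    lambda_kl, lambda_crps = 1.0, 1.0  # placeholder weightings, tuned by monitoring training

    for batch in train_loader:  # placeholder data loader
        optimizer.zero_grad()
        # BFloat16 autocast reduces activation memory; bf16 needs no GradScaler
        with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
            prediction = model(*batch)
            loss = (lambda_kl * kl_term(prediction, batch)
                    + lambda_crps * crps_term(prediction, batch))
        loss.backward()
        optimizer.step()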