Lagrangian Flow Networks for Conservation Laws

Authors: Fabricio Arend Torres, Marcello Massimo Negri, Marco Inversi, Jonathan Aellen, Volker Roth

ICLR 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We assess LFlows in multiple application settings and show better predictive performance than existing methods while staying computationally feasible and physically consistent. As a real-world application, we model bird migration based on sparse weather radar measurements.
Researcher Affiliation | Academia | Fabricio Arend Torres, Marcello M. Negri, Marco Inversi, Jonathan Aellen & Volker Roth, Department of Mathematics and Computer Science, University of Basel
Pseudocode | No | The paper does not contain any clearly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | We provide code at https://github.com/bmda-unibas/LagrangianFlowNetworks. We furthermore provide access to the used conditional bijective layers in a separate Python package called FlowConductor. This package includes conditional i-DenseNets with sinusoidal activations, as well as the conditional SVD layers.
Open Datasets | Yes | The data provided by Nussbaumer et al. (2021) is originally based on weather radar measurements made available by the European Operational Program for Exchange of Weather Radar Information (EUMETNET/OPERA). ... The final density and velocity measurements we use are openly available. [Footnote 5 points to https://zenodo.org/record/4587338/]
Dataset Splits | Yes | Hyperparameters of all models are selected by minimizing the density MSE on three nights of March 2018. We optimize each model based on the explained variance (R²) of the density on validation data. (See the R² sketch after this table.)
Hardware Specification | Yes | We limit all models to the computing resources of a NVIDIA Titan X Pascal. Each individual experiment for the synthetic data was run on individual NVIDIA TITAN X GPUs (12GB VRAM), using 20 CPU cores and 20GB RAM. The experiment was run on an A100 GPU (40GB VRAM), using 20 CPUs and 30GB RAM.
Software Dependencies | Yes | A cleaned Anaconda environment file for reproducing the Python environment is provided. Our code for LFlows is based on the nflows library for bijective neural networks (Durkan et al., 2020). The adjoint is computed with the torchdiffeq library (Chen, 2018). (See the adjoint sketch after this table.)
Experiment Setup | Yes | We trained the LFlows and PINNs on a minibatch size of 16384 and the SLDA on a minibatch size of 4096. LFlows: the model was trained with the ADAM optimizer for 5000 iterations with a learning rate of 2e-3 and 2048 data points per iteration. We trained for 50 epochs with a minibatch size of 16384 using the ADAM optimizer with a learning rate of 1e-2, a weight decay of 2e-3 and a cosine annealing learning rate schedule. (See the training-loop sketch after this table.)
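
The Dataset Splits row above mentions model selection by the explained variance (R²) of the density on validation data. Below is a minimal sketch of such a criterion, assuming PyTorch tensors; the array names are hypothetical and this is not the authors' code.

```python
# Sketch of an explained-variance (R^2-style) validation criterion.
# Hypothetical names; assumes density predictions as flat tensors.
import torch

def explained_variance(y_true: torch.Tensor, y_pred: torch.Tensor) -> float:
    """Explained variance: 1 - Var[y - y_hat] / Var[y]."""
    residual_var = torch.var(y_true - y_pred)
    total_var = torch.var(y_true)
    return (1.0 - residual_var / total_var).item()

# Example: score a model's density predictions on held-out data.
val_density = torch.randn(1000)                    # placeholder targets
val_pred = val_density + 0.1 * torch.randn(1000)   # placeholder predictions
print(explained_variance(val_density, val_pred))   # closer to 1.0 is better
```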
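
The Software Dependencies row notes that the adjoint is computed with the torchdiffeq library. The following is a minimal sketch of an adjoint-based ODE solve with torchdiffeq's `odeint_adjoint`; the dynamics module is a toy stand-in, not the paper's velocity model.

```python
# Sketch: solving an ODE with gradients via the adjoint method,
# using torchdiffeq as cited above. The dynamics are a toy example.
import torch
from torchdiffeq import odeint_adjoint

class Dynamics(torch.nn.Module):
    """Toy dynamics dy/dt = f(t, y); must be an nn.Module for the adjoint."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(2, 2)

    def forward(self, t, y):
        return torch.tanh(self.net(y))

func = Dynamics()
y0 = torch.randn(16, 2)              # batch of initial states
t = torch.linspace(0.0, 1.0, 10)     # evaluation times
ys = odeint_adjoint(func, y0, t)     # shape: (10, 16, 2)
ys[-1].sum().backward()              # parameter gradients via the adjoint ODE
```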
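
The Experiment Setup row quotes a training configuration of Adam with a learning rate of 1e-2, a weight decay of 2e-3, a cosine annealing schedule, 50 epochs, and minibatches of 16384. Here is a minimal sketch of that configuration in PyTorch; the model, dataset, and loss are generic placeholders, not the authors' LFlow implementation.

```python
# Sketch of the quoted training setup: Adam (lr=1e-2, weight_decay=2e-3),
# cosine annealing over 50 epochs, minibatch size 16384. All names are
# placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: (t, x, y) inputs and scalar density targets.
inputs = torch.randn(100_000, 3)
targets = torch.randn(100_000, 1)
loader = DataLoader(TensorDataset(inputs, targets),
                    batch_size=16384, shuffle=True)

model = torch.nn.Sequential(         # stand-in for the actual model
    torch.nn.Linear(3, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2, weight_decay=2e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)

for epoch in range(50):
    for x, y in loader:
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()
    scheduler.step()                 # one cosine-annealing step per epoch
```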