Physics-Aware Downsampling with Deep Learning for Scalable Flood Modeling
Authors: Niv Giladi, Zvika Ben-Haim, Sella Nevo, Yossi Matias, Daniel Soudry
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 5 Experiments In this section, we leverage our physics-aware DNN to accurately solve the shallow water equations on a coarse grid, over a variety of elevation maps and boundary conditions. Specifically, we: 1. Demonstrate that the system (Sec. 4.1) can locate hydraulically significant details of the elevation map, using the accumulated gradient after backpropagation through the PDEs. 2. Optimize the DNN over a single elevation map with multiple boundary conditions (eq. 4) and demonstrate better performance compared to traditional downsampling. This is a typical setting in operational flood modeling systems, where repeated simulations are performed with different boundary conditions. 3. Train the DNN over a training set of different elevation maps and boundary conditions (eq. 3) and show generalization capabilities on both elevation maps and boundary conditions. This would enable fast downsampling of maps in new areas. |
| Researcher Affiliation | Collaboration | Niv Giladi (1,2), Zvika Ben-Haim (1), Sella Nevo (1), Yossi Matias (1), Daniel Soudry (2); 1 Google Research; 2 Technion - Israel Institute of Technology; {giladiniv, daniel.soudry}@gmail.com, {zvika, sellanevo, yossi}@google.com |
| Pseudocode | No | No explicitly labeled pseudocode or algorithm blocks are present in the paper. |
| Open Source Code | Yes | A reference implementation accompanies the paper as well as documentation and code for dataset reproduction. ... 1The code for this paper is available at https://github.com/tech-submissions/physics-aware-downsampling |
| Open Datasets | Yes | To this end, we configured a dataset designed specifically for inundation modeling. Each sample consists of a fine grid elevation map, boundary conditions, and a simulation time. The ground truth of each sample is the corresponding fluid state, calculated on a fine grid. This dataset is available along with code for further use. ... We provide access to the full dataset along with code to reproduce the data and expand to more data samples. |
| Dataset Splits | Yes | We train such model on 4031 elevation maps, each with different boundary conditions. We validate the model on 1155 unseen elevation maps, each with different boundary conditions. |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory specifications) are provided for the experimental setup. |
| Software Dependencies | No | The paper mentions software components like 'ResNet-18', 'Adam', and 'SGD', but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | The training loss was optimized using Adam [20], and a batch size of 32 samples. ... we performed our learning rate scan within the relatively low range of [10⁻³, 10⁻¹] to find the empirically optimal learning rate in terms of convergence rate and generalization, as well as numerical stability of the PDEs. Adam achieved better generalization and therefore is used. ... We used Adam with a learning rate of 10⁻³, and a batch size of 32 samples. The training was done in 90 epochs. |
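The experiment-setup and dataset-split rows above pin down the reported training configuration (Adam, learning rate 10⁻³, batch size 32, 90 epochs, 4031 training maps). A minimal sketch of that budget, with illustrative names not taken from the paper's code:

```python
# Training configuration as reported in the paper; dict keys are illustrative.
HYPERPARAMS = {
    "optimizer": "Adam",
    "learning_rate": 1e-3,
    "batch_size": 32,
    "epochs": 90,
}

# Reported split sizes: 4031 training / 1155 validation elevation maps.
N_TRAIN, N_VAL = 4031, 1155


def num_optimizer_steps(n_train_samples: int, batch_size: int, epochs: int) -> int:
    """Total gradient steps for a full run, assuming drop-last batching
    (an assumption; the paper does not state how partial batches are handled)."""
    return (n_train_samples // batch_size) * epochs


steps = num_optimizer_steps(N_TRAIN, HYPERPARAMS["batch_size"], HYPERPARAMS["epochs"])
```

Under the drop-last assumption this gives 125 batches per epoch, so the reported 90 epochs amount to 11,250 optimizer steps.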