MAgNet: Mesh Agnostic Neural PDE Solver

Authors: Oussama Boussif, Yoshua Bengio, Loubna Benabbou, Dan Assouline

NeurIPS 2022

Reproducibility Variable Result LLM Response
Research Type Experimental Our Mesh Agnostic Neural PDE Solver (MAgNet) is able to make accurate predictions across a variety of PDE simulation datasets and compares favorably with existing baselines. Moreover, MAgNet generalizes well to different meshes and to resolutions up to four times those it was trained on. We evaluate all models on both 1D and 2D simulations (the datasets generated from 1D and 2D PDEs are presented in Appendix A.4). All models are evaluated using the Mean Absolute Error (MAE) on the rolled-out predictions, averaged across time and space.
Researcher Affiliation Academia Oussama Boussif, Mila Québec AI Institute, DIRO, Université de Montréal (oussama.boussif@mila.quebec); Dan Assouline, Mila Québec AI Institute, DIRO, Université de Montréal (dan.assouline@mila.quebec); Loubna Benabbou, Université du Québec à Rimouski (Loubna_Benabbou@uqar.ca); Yoshua Bengio, Mila Québec AI Institute, DIRO, Université de Montréal (yoshua.bengio@mila.quebec)
Pseudocode No The paper describes the framework and calculations with mathematical equations and figures, but does not include a dedicated pseudocode or algorithm block.
Open Source Code Yes Code and dataset can be found on: https://github.com/jaggbow/magnet
Open Datasets Yes Code and dataset can be found on: https://github.com/jaggbow/magnet
Dataset Splits No All training sets in the 1D case contain 2048 simulations and test sets contain 128 simulations. For the 2D case, training sets contain 1000 simulations and test sets contain 100 simulations. While early stopping is mentioned, no explicit validation set split percentages or counts are provided.
Hardware Specification No The paper does not specify any particular hardware (e.g., GPU/CPU models, memory amounts) used for running the experiments. It only generally refers to 'modern supercomputers'.
Software Dependencies Yes PyTorchLightning/pytorch-lightning: 0.7.6 release.
Experiment Setup Yes We train models for 250 epochs with early stopping (patience of 40 epochs). See Appendix A.6 and A.7 for more implementation details.
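The evaluation metric quoted above is the MAE on rolled-out predictions, averaged across time and space. A minimal sketch of that computation, assuming the prediction and target are arrays with a leading time axis followed by spatial axes (the function name `rollout_mae` is ours, not the paper's):

```python
import numpy as np

def rollout_mae(pred: np.ndarray, target: np.ndarray) -> float:
    """MAE between a rolled-out prediction and the ground-truth
    trajectory, averaged over all time steps and spatial points.

    pred, target: shape (time, x) for 1D PDEs or (time, x, y) for 2D.
    """
    assert pred.shape == target.shape
    return float(np.mean(np.abs(pred - target)))

# Toy usage: a 2-step, 2-point "trajectory".
pred = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.zeros((2, 2))
print(rollout_mae(pred, target))  # mean of |1|,|2|,|3|,|4| = 2.5
```

Averaging jointly over time and space (rather than per-step) matches the description of a single scalar MAE per simulation; per-dataset numbers would then be averaged over test simulations.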
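The training protocol above (250 epochs, early stopping with patience 40 on a validation metric) can be sketched as a generic early-stopping loop; the `run_epoch` callback and the exact monitored metric are assumptions for illustration, since the paper does not quote its training loop:

```python
def train_with_early_stopping(run_epoch, max_epochs=250, patience=40):
    """Generic early-stopping loop: stop once `patience` consecutive
    epochs pass without improvement in the monitored validation loss.

    run_epoch(epoch) -> float: hypothetical callback that trains one
    epoch and returns the validation loss.
    """
    best, best_epoch, wait = float("inf"), -1, 0
    for epoch in range(max_epochs):
        val_loss = run_epoch(epoch)
        if val_loss < best:
            best, best_epoch, wait = val_loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break  # patience exhausted
    return best, best_epoch
```

In practice the same behavior is available out of the box via the listed pytorch-lightning dependency's `EarlyStopping` callback with `patience=40` passed to the `Trainer`.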