Modeling Dynamics over Meshes with Gauge Equivariant Nonlinear Message Passing

Authors: Jung Yeon Park, Lawson Wong, Robin Walters

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate Hermes on several linear and nonlinear partial differential equations, and also on shape correspondence, object interaction system, and cloth dynamics. Our experiments show that Hermes outperforms convolutional or attentional counterparts in most domains, particularly on nonlinear surface PDEs.
Researcher Affiliation | Academia | Jung Yeon Park, Lawson L.S. Wong, Robin Walters, Khoury College of Computer Sciences, Northeastern University, Boston, MA 02115 {park.jungy@northeastern.edu, lsw@ccs.neu.edu, r.walters@northeastern.edu}
Pseudocode | No | The paper does not contain explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Project website with code: https://jypark0.github.io/hermes
Open Datasets | Yes | For the PDE datasets, we use example meshes in the PyVista library [30] and generate 5 trajectories... The FAUST dataset [74] consists of 80 train and 20 test high-resolution scans... We also include the Flag Simple dataset from [20]...
Dataset Splits | Yes | For the PDE datasets... For test time, we train on timesteps T = 0, ..., 149 and test on T = 150, ..., 200. For test init, we test on trajectories with new initial conditions. For test mesh, we evaluate on completely unseen meshes...
Hardware Specification | No | The paper mentions using "the Discovery cluster" but does not provide specific hardware details (e.g., GPU/CPU models, memory) of the computing resources used for experiments.
Software Dependencies | No | The paper mentions software such as PyVista, DOLFINx, and ArcSim, but does not provide specific version numbers for these or other key software components.
Experiment Setup | Yes | Table 7 contains the architecture details and hyperparameters used for each domain. We use a band limit of 4 for the gauge-equivariant kernels. Throughout, L2 regularization of the weights with a coefficient of 1 × 10^-5 was used. A cosine learning rate scheduler was used for the Objects and Cahn-Hilliard datasets.
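The Experiment Setup row mentions a cosine learning rate scheduler (used for the Objects and Cahn-Hilliard datasets) and L2 weight regularization with coefficient 1 × 10^-5. As a minimal stdlib-only sketch of that schedule shape, the snippet below computes a cosine-annealed learning rate; the peak rate `lr_max=1e-3` is a hypothetical placeholder, since the paper's actual per-domain values live in its Table 7.

```python
import math

def cosine_lr(step, total_steps, lr_max=1e-3, lr_min=0.0):
    """Cosine-annealed learning rate: starts at lr_max, decays to lr_min.

    lr_max here is illustrative only; the paper's Table 7 lists the real
    hyperparameters. The L2 coefficient (1e-5) would typically be passed
    separately to the optimizer as a weight-decay term.
    """
    cos_term = 1 + math.cos(math.pi * step / total_steps)
    return lr_min + 0.5 * (lr_max - lr_min) * cos_term

# Schedule endpoints: full rate at step 0, fully annealed at the final step.
print(cosine_lr(0, 100))    # -> 0.001
print(cosine_lr(100, 100))  # -> 0.0 (up to floating-point rounding)
```

In a typical PyTorch setup, the same effect comes from `torch.optim.lr_scheduler.CosineAnnealingLR` together with `weight_decay=1e-5` on the optimizer; this sketch only shows the underlying formula.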