Discovering Nonlinear PDEs from Scarce Data with Physics-encoded Learning

Authors: Chengping Rao, Pu Ren, Yang Liu, Hao Sun

ICLR 2022

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We validate our method on three nonlinear PDE systems. The effectiveness and superiority of the proposed method over baseline models are demonstrated." |
| Researcher Affiliation | Academia | Northeastern University ({rao.che, ren.pu, yang1.liu}@northeastern.edu); Hao Sun, Renmin University of China (haosun@ruc.edu.cn) |
| Pseudocode | No | The paper describes its methodology using textual descriptions and figures, but does not include any formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | "The dataset and training script for each case considered in this paper can be found in https://github.com/Raocp/Discover-PDE-with-Noisy-Scarce-Data." |
| Open Datasets | Yes | "The dataset and training script for each case considered in this paper can be found in https://github.com/Raocp/Discover-PDE-with-Noisy-Scarce-Data." |
| Dataset Splits | Yes | "Among the entire measurement, 10% data is split as the validation dataset." |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU models, CPU types, memory) used to run the experiments; it only mentions "numerical simulations" and a "computer's memory limit" without naming any hardware. |
| Software Dependencies | No | The paper mentions software components such as the Adam optimizer and a Runge-Kutta scheme, but does not provide version numbers for any software dependencies or libraries used in the implementation. |
| Experiment Setup | Yes | "The learning rate is initialized to be 0.002 and decreases to 97% of the previous for every 200 iterations." Table C.1 ("Range of hyperparameters for the data reconstruction network") lists specific values for kernel size, number of layers, number of channels, learning rate, and regularizer weight. |
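The learning-rate schedule quoted in the Experiment Setup row (start at 0.002, multiply by 0.97 every 200 iterations) can be sketched as a step-wise exponential decay. This is a minimal illustrative reading of that sentence, not the authors' code; the function name and signature below are assumptions, and the actual implementation in the linked repository may differ.

```python
def learning_rate(iteration, lr0=0.002, decay=0.97, step=200):
    """Step-wise exponential decay: the rate drops to 97% of its
    previous value once every `step` iterations.

    Note: this is a sketch inferred from the paper's description,
    not the authors' implementation.
    """
    return lr0 * decay ** (iteration // step)

# Example: the rate is constant within each 200-iteration window,
# then decays by a factor of 0.97 at the window boundary.
print(learning_rate(0))    # 0.002
print(learning_rate(200))  # 0.002 * 0.97
```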