Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
PeSANet: Physics-encoded Spectral Attention Network for Simulating PDE-Governed Complex Systems
Authors: Han Wan, Rui Zhang, Qi Wang, Yang Liu, Hao Sun
IJCAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | To rigorously evaluate the efficacy and generalizability of the proposed PeSANet, we conducted a series of comprehensive experiments on a wide range of complex systems, such as fluid dynamics and reaction-diffusion systems. Specifically, the model is evaluated on the two-dimensional Burgers equation, the two-dimensional FitzHugh-Nagumo (FN) system, the two-dimensional Gray-Scott (GS) system, and the two-dimensional Navier-Stokes equations (NSE). The experimental results demonstrate that PeSANet outperforms existing methods across all metrics, particularly in long-term forecasting accuracy, providing a promising solution for simulating complex systems with limited data and incomplete physics. |
| Researcher Affiliation | Academia | Han Wan^1, Rui Zhang^1, Qi Wang^1, Yang Liu^2, Hao Sun^1; 1: Gaoling School of Artificial Intelligence, Renmin University of China, Beijing, China; 2: School of Engineering Science, University of Chinese Academy of Sciences, Beijing, China. EMAIL, EMAIL |
| Pseudocode | No | The paper describes the methodology in detail, including the architecture of PeSANet and its components, but it does not include any explicitly labeled 'Pseudocode' or 'Algorithm' block, nor structured steps formatted like code. |
| Open Source Code | Yes | The source code and data are found at https://github.com/intell-sci-comput/PeSANet. |
| Open Datasets | Yes | The source code and data are found at https://github.com/intell-sci-comput/PeSANet. We considered several two-dimensional PDE-governed nonlinear complex systems, including the Burgers equation, the FitzHugh-Nagumo (FN) system, the Gray-Scott (GS) system, and the Navier-Stokes equations (NSE). We summarize the datasets in Table 1, with a more detailed introduction in Appendix Dataset Informations. |
| Dataset Splits | Yes | In this paper, we target the data-scarce scenario, and each experiment includes 2-5 trajectories in the training set. We summarize the datasets in Table 1, with a more detailed introduction in Appendix Dataset Informations. Table 1 (summary of experimental settings, training / test trajectories per case): Burgers, 5 / 5; FN, 5 / 5; GS, 2 / 5; NSE, 5 / 14. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU models, CPU types, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions that the model is optimized using the Adam optimizer, but it does not specify any software names with version numbers (e.g., Python, PyTorch, TensorFlow, CUDA versions). |
| Experiment Setup | No | We train our model autoregressively, where the model predicts an output at each iteration, and the output is used as input for the next prediction. The mean squared error loss function is utilized for training, optimized using the Adam optimizer. While the loss function and optimizer are mentioned, specific hyperparameters such as learning rate, batch size, or number of epochs are not provided in the main text. |
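The autoregressive scheme quoted above (each prediction is fed back as the input for the next step) can be sketched on a toy problem. The linear operator `A`, the perturbation size, and all variable names below are illustrative assumptions, not details from the paper; the sketch only shows why the paper emphasizes long-term forecasting accuracy: a small one-step error compounds over an autoregressive rollout.

```python
import numpy as np

# Toy linear stand-in for one discretized PDE time step: x_{t+1} = A @ x_t.
A = np.array([[0.9, 0.1], [-0.1, 0.9]])  # "true" update operator (assumed)
W = A + 0.05                             # slightly imperfect learned model

def rollout(op, x0, steps):
    """Autoregressive rollout: each prediction becomes the next input."""
    xs, x = [], x0
    for _ in range(steps):
        x = op @ x
        xs.append(x)
    return np.stack(xs)

x0 = np.array([1.0, -0.5])
truth = rollout(A, x0, 50)
pred = rollout(W, x0, 50)

# One-step MSE vs. MSE accumulated over the 50-step rollout.
step_err = float(np.mean((W @ x0 - A @ x0) ** 2))
roll_err = float(np.mean((pred - truth) ** 2))
print(step_err, roll_err)  # the rollout error is far larger than the one-step error
```

Training on the rollout loss (MSE between `pred` and `truth`, optimized with Adam, as the paper states) rather than on single-step pairs is what forces the model to control this error accumulation.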