Meta-Auto-Decoder for Solving Parametric Partial Differential Equations
Authors: Xiang Huang, Zhanhong Ye, Hongsheng Liu, Shi Ji, Zidong Wang, Kang Yang, Yang Li, Min Wang, Haotian Chu, Fan Yu, Bei Hua, Lei Chen, Bin Dong
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive numerical experiments are carried out to demonstrate the effectiveness of our method, which show that MAD can significantly improve the convergence speed and has good extrapolation ability for OOD settings. |
| Researcher Affiliation | Collaboration | Xiang Huang (sahx@mail.ustc.edu.cn), University of Science and Technology of China; Zhanhong Ye (yezhanhong@pku.edu.cn), Peking University; Hongsheng Liu (liuhongsheng4@huawei.com), Huawei Technologies Co. Ltd; ...; Bei Hua (bhua@ustc.edu.cn), University of Science and Technology of China; Lei Chen (leichen@cse.ust.hk), Hong Kong University of Science and Technology; Bin Dong (dongbin@math.pku.edu.cn), Beijing International Center for Mathematical Research and Center for Machine Learning Research, Peking University |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. It includes diagrams and descriptive text of the method but no formal algorithm steps. |
| Open Source Code | Yes | The project page with code is available: https://gitee.com/mindspore/mindscience/tree/master/MindElec/. |
| Open Datasets | No | The initial condition u0(x) is generated using Gaussian random field (GRF) [24]... The variable PDE parameters include the shape of the solution domain (the shape of the triangle) and the boundary conditions on the three sides of the triangle, so the PDE parameters here are heterogeneous. The paper describes how data is generated (e.g., using Gaussian random fields) but does not provide a link or specific access information for a pre-existing public dataset. |
| Dataset Splits | No | For each experiment, the PDE parameters are divided into two sets: S1 and S2. Parameters in S1 correspond to sample tasks for pre-training, and parameters in S2 correspond to new tasks for fine-tuning. ... Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? No |
| Hardware Specification | No | Unless otherwise specified, all the experiments are conducted under MindSpore. No specific hardware details (such as GPU or CPU models, memory, or cloud provider names with specs) are mentioned for running experiments. The reproducibility checklist also states 'No' for 'total amount of compute and the type of resources used'. |
| Software Dependencies | No | Unless otherwise specified, all the experiments are conducted under MindSpore. No specific version numbers for MindSpore or any other software dependencies are provided. |
| Experiment Setup | No | See Appendix B for the default experimental setup, and more detailed experimental setups and results for the Burgers equation, Maxwell's equation and Laplace's equation are given in Appendix C, D and E respectively. ... Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? No |
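The Open Datasets entry above notes that initial conditions u0(x) are generated from a Gaussian random field (GRF) rather than drawn from a published dataset. For readers attempting a reproduction, the snippet below is a minimal sketch of sampling a 1D GRF on a uniform grid via a Cholesky factorization of the covariance matrix. The squared-exponential kernel and all parameter values are assumptions for illustration; the paper does not specify its exact GRF covariance here, so these choices are not the authors' configuration.

```python
import numpy as np

def sample_grf_1d(n_points=128, length_scale=0.1, variance=1.0, seed=0):
    """Draw one sample of a 1D Gaussian random field on [0, 1].

    Assumes a squared-exponential covariance kernel (an illustrative
    choice, not necessarily the paper's); samples by multiplying the
    Cholesky factor of the covariance matrix with standard normals.
    """
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_points)
    # Kernel matrix K[i, j] = variance * exp(-(x_i - x_j)^2 / (2 * l^2))
    diff = x[:, None] - x[None, :]
    K = variance * np.exp(-0.5 * (diff / length_scale) ** 2)
    # Small diagonal jitter keeps the Cholesky factorization stable
    L = np.linalg.cholesky(K + 1e-8 * np.eye(n_points))
    u0 = L @ rng.standard_normal(n_points)
    return x, u0

x, u0 = sample_grf_1d()
print(u0.shape)  # (128,)
```

Varying the random seed yields a family of initial conditions, which is the role the GRF plays in generating the parametric PDE tasks described in the paper.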