Solving High-Dimensional PDEs with Latent Spectral Models
Authors: Haixu Wu, Tengge Hu, Huakun Luo, Jianmin Wang, Mingsheng Long
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimentally, LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks covering both solid and fluid physics. |
| Researcher Affiliation | Academia | Haixu Wu 1 Tengge Hu 1 Huakun Luo 1 Jianmin Wang 1 Mingsheng Long 1 1School of Software, BNRist, Tsinghua University. |
| Pseudocode | No | The paper describes the proposed Latent Spectral Models and its components (hierarchical projection network, neural spectral block) in text and through a diagram (Figure 2), but it does not include any explicitly labeled 'Pseudocode' or 'Algorithm' blocks. |
| Open Source Code | Yes | Code is available at https://github.com/thuml/Latent-Spectral-Models. |
| Open Datasets | Yes | We take the Navier-Stokes dataset from (Li et al., 2021). We use the Darcy dataset proposed in (Li et al., 2021) |
| Dataset Splits | No | The paper mentions 'TRAIN SET SIZE' and 'TEST SET SIZE' in Table 5 but does not explicitly detail a 'validation set' or 'validation split' for the reproduction of experiments. |
| Hardware Specification | Yes | All experiments are repeated five times, implemented in PyTorch (Paszke et al., 2019) and conducted on a single NVIDIA RTX 3090 24GB GPU. |
| Software Dependencies | No | The paper states 'implemented in PyTorch (Paszke et al., 2019)' but does not provide specific version numbers for PyTorch or any other software libraries or dependencies used in the experiments. |
| Experiment Setup | Yes | For fairness, all the methods are trained with L2 loss and 500 epochs, using the ADAM (Kingma & Ba, 2015) optimizer with an initial learning rate of 10⁻³. The batch size is set to 20. |
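The reported setup (Adam, initial learning rate 10⁻³, L2 loss, batch size 20, 500 epochs) can be sketched as a minimal PyTorch training loop. This is a hedged illustration, not the authors' code: the model is a placeholder `nn.Linear` standing in for LSM, the data is random, and `nn.MSELoss` stands in for the L2 training objective; the actual implementation is in the linked repository.

```python
import torch
from torch import nn

model = nn.Linear(64, 64)  # placeholder for the LSM model (assumption)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # ADAM, lr = 10^-3
criterion = nn.MSELoss()  # stand-in for the paper's L2 loss

BATCH_SIZE = 20  # as reported in the paper
EPOCHS = 500     # as reported; trimmed to 1 below for brevity

# toy tensors standing in for a PDE benchmark dataset
x = torch.randn(100, 64)
y = torch.randn(100, 64)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, y), batch_size=BATCH_SIZE
)

for epoch in range(1):  # replace 1 with EPOCHS for a full run
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()
```

With 100 samples and a batch size of 20, each epoch runs 5 optimizer steps; the paper repeats each full 500-epoch run five times.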