Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions
Authors: Nicholas Gao, Stephan Günnemann
ICLR 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In our experiments, our Potential Energy Surface Network (PESNet) consistently matches or surpasses the results of the previous best neural wave functions while training less than 1/40 of the time for high-resolution potential energy surface scans. To investigate PESNet's accuracy and training time benefit, we compare it to FermiNet (Pfau et al., 2020; Spencer et al., 2020), PauliNet (Hermann et al., 2020), and DeepErwin (Scherbela et al., 2021) on diverse systems ranging from 3 to 28 electrons. |
| Researcher Affiliation | Academia | Nicholas Gao & Stephan Günnemann, Department of Informatics & Munich Data Science Institute, Technical University of Munich, Germany {gaoni,guennemann}@in.tum.de |
| Pseudocode | No | The paper includes architectural diagrams (e.g., Figure 2) and descriptive text, but it does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | To reduce the likelihood of such misuse, we publish our source code under the Hippocratic license (Ehmke, 2019)¹. To facilitate reproducibility, the source code includes simple scripts to reproduce all experiments from Section 4. Furthermore, we provide a detailed schematic of the computational graph in Figure 2 and additional details on the experimental setup including all hyperparameters in Appendix D. ¹ https://www.daml.in.tum.de/pesnet |
| Open Datasets | Yes | For H4+ and cyclobutadiene, we train on discrete sets of geometries from the literature (Scherbela et al., 2021; Kinal & Piecuch, 2007). The hydrogen chain is a very common benchmark geometry that allows us to compare our method to a range of classical methods (Motta et al., 2017)... |
| Dataset Splits | No | The paper discusses training on continuous subsets of potential energy surfaces and evaluating on configurations for which reference calculations are available. However, it does not specify train/validation/test dataset splits (e.g., percentages or sample counts) that would be needed for reproducibility. |
| Hardware Specification | Yes | All measurements have been conducted on a machine with 16 AMD EPYC 7543 cores and a single Nvidia A100 GPU. |
| Software Dependencies | No | The paper mentions using JAX for its implementation and compares with PyTorch and TensorFlow implementations of other methods, but it does not provide specific version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | The exact procedure and the general experimental setup are described in Appendix D. Additional ablation studies are available in Appendix E. If not otherwise specified, we used the hyperparameters from Table 2. |