Periodic Graph Transformers for Crystal Material Property Prediction

Authors: Keqiang Yan, Yi Liu, Yuchao Lin, Shuiwang Ji

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results on multiple common benchmark datasets show that our Matformer outperforms baseline methods consistently. In addition, our results demonstrate the importance of periodic invariance and explicit repeating pattern encoding for crystal representation learning. Our code is publicly available at https://github.com/YKQ98/Matformer."
Researcher Affiliation | Academia | Keqiang Yan, Computer Science & Engineering, Texas A&M University, College Station, TX 77843, keqiangyan@tamu.edu; Yi Liu, Computer Science, Florida State University, Tallahassee, FL 32306, liuy@cs.fsu.edu; Yuchao Lin, Computer Science & Engineering, Texas A&M University, College Station, TX 77843, kruskallin@tamu.edu; Shuiwang Ji, Computer Science & Engineering, Texas A&M University, College Station, TX 77843, sji@tamu.edu
Pseudocode | No | The paper does not contain a pseudocode block or algorithm block.
Open Source Code | Yes | "Our code is publicly available at https://github.com/YKQ98/Matformer."
Open Datasets | Yes | "We conduct experiments on two commonly used material benchmark datasets, including the Materials Project [16] and Jarvis [8]."
Dataset Splits | Yes | "To make the comparison clear and fair, we retrain all corresponding models using exactly the same training, validation and test sets across all methods and report the results in Table. 1." (A hedged dataset-loading and split sketch follows the table.)
Hardware Specification | Yes | "We adjust the Matformer configurations to train these large graphs on a single RTX A6000 GPU."
Software Dependencies | No | The paper mentions the use of the Adam optimizer, weight decay, and a learning rate scheduler, but does not provide version numbers for software libraries (e.g., PyTorch or CUDA).
Experiment Setup | Yes | "All Matformer models are trained using the Adam optimizer [21] with weight decay [32] and one cycle learning rate scheduler [43]. We only slightly adjust learning rates from 0.001 and training epochs from 500 for different tasks. Detailed Matformer configurations for different tasks are provided in Appendix. A.4." (A hedged sketch of this optimization recipe follows the table.)
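
The Open Datasets and Dataset Splits rows quote the paper. As a concrete illustration only, the sketch below loads the JARVIS-DFT snapshot with the jarvis-tools package and fixes a reproducible train/validation/test split; the `data("dft_3d")` call, the split fractions, and the seed are assumptions for illustration, not settings taken from the paper.

```python
# Hedged sketch: load a JARVIS-DFT snapshot and fix a reproducible split.
# The "dft_3d" snapshot name, split fractions, and seed are illustrative assumptions.
import random

from jarvis.db.figshare import data  # pip install jarvis-tools


def fixed_split(n_total, train_frac=0.8, val_frac=0.1, seed=123):
    """Return index lists for a reproducible train/val/test partition."""
    indices = list(range(n_total))
    random.Random(seed).shuffle(indices)  # fixed seed -> identical split across methods
    n_train = int(train_frac * n_total)
    n_val = int(val_frac * n_total)
    return (
        indices[:n_train],
        indices[n_train:n_train + n_val],
        indices[n_train + n_val:],
    )


if __name__ == "__main__":
    entries = data("dft_3d")  # list of dicts with crystal structures and target properties
    train_idx, val_idx, test_idx = fixed_split(len(entries))
    print(len(train_idx), len(val_idx), len(test_idx))
```

Keeping the split behind a single fixed seed is what makes "exactly the same training, validation and test sets across all methods" straightforward to enforce when retraining baselines.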
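The Experiment Setup row reports Adam with weight decay and a one-cycle learning rate scheduler, a base learning rate of 0.001, and around 500 training epochs. Below is a minimal PyTorch sketch of that recipe; the choice of AdamW, the weight-decay value, and the placeholder model and data loader are assumptions, and the exact per-task configurations are given in the paper's Appendix A.4 and the released code.

```python
# Hedged sketch of the reported optimization recipe: Adam-style optimizer with
# weight decay plus a one-cycle learning rate schedule. Values not stated in the
# paper (weight_decay) are placeholders.
import torch


def build_optimizer_and_scheduler(model, train_loader, lr=1e-3, epochs=500, weight_decay=1e-5):
    # AdamW implements decoupled weight decay; plain Adam with a weight_decay
    # argument is an equally plausible reading of the paper's description.
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr, weight_decay=weight_decay)
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer,
        max_lr=lr,
        epochs=epochs,
        steps_per_epoch=len(train_loader),
    )
    return optimizer, scheduler

# Typical usage: call optimizer.step() and then scheduler.step() once per batch.
```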