A Diffusion-Based Pre-training Framework for Crystal Property Prediction
Authors: Zixing Song, Ziqiao Meng, Irwin King
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments demonstrate that CrysDiff can significantly improve the performance of the downstream crystal property prediction task on multiple target properties, outperforming all the SOTA pre-training models for crystals with good margins on the popular JARVIS-DFT dataset. |
| Researcher Affiliation | Academia | Zixing Song, Ziqiao Meng, Irwin King; The Chinese University of Hong Kong; zxsong@cse.cuhk.edu.hk, zqmeng@cse.cuhk.edu.hk, king@cse.cuhk.edu.hk |
| Pseudocode | Yes | Algorithm 1: Pre-training Phase of CrysDiff (a generic sketch of such a diffusion pre-training loop follows the table) |
| Open Source Code | No | No explicit statement or link providing access to open-source code for the described methodology. |
| Open Datasets | Yes | We collect 800K untagged crystal graph data from two popular materials databases, Materials Project (MP) (Jain et al. 2013) and OQMD (Saal et al. 2013), to pre-train the CrysDiff model. Further, to evaluate the fine-tuning performance of CrysDiff compared with other crystal property predictors, we select the 2021.8.18 version of JARVIS-DFT (Choudhary et al. 2020), another popular materials database, for the downstream property prediction task. (A dataset-loading sketch follows the table.) |
| Dataset Splits | Yes | For each property, all the models are trained on 80% of the data, validated on 10%, and tested on 10%. (A split sketch follows the table.) |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory) are provided for running experiments. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., Python version, library versions) are listed. |
| Experiment Setup | No | The paper describes the data splits and loss functions, but does not provide specific hyperparameter values such as learning rate, batch size, number of epochs, or optimizer settings for training. |
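
The paper's Algorithm 1 is not reproduced in this report. Purely as illustration, the sketch below shows the generic DDPM-style denoising objective that a diffusion pre-training phase such as CrysDiff's typically optimizes: noise crystal coordinates at a random timestep and train the model to predict the added noise. The model signature, noise schedule, and tensor shapes are assumptions, not the paper's specification.

```python
# Minimal DDPM-style pre-training step (sketch only, NOT the paper's Algorithm 1).
# `model` is a hypothetical noise-prediction network taking (noised coords, timestep).
import torch
import torch.nn.functional as F

T = 1000                                        # assumed number of diffusion steps
betas = torch.linspace(1e-4, 0.02, T)           # assumed linear noise schedule
alpha_bars = torch.cumprod(1.0 - betas, dim=0)  # cumulative product \bar{alpha}_t

def pretrain_step(model, coords, optimizer):
    """One denoising-objective update on a batch of crystal coordinates."""
    t = torch.randint(0, T, (coords.shape[0],))          # random timestep per sample
    noise = torch.randn_like(coords)                     # epsilon ~ N(0, I)
    a_bar = alpha_bars[t].view(-1, *([1] * (coords.dim() - 1)))
    # Forward process q(x_t | x_0): interpolate between data and noise.
    noised = a_bar.sqrt() * coords + (1.0 - a_bar).sqrt() * noise
    pred = model(noised, t)                              # predict the injected noise
    loss = F.mse_loss(pred, noise)                       # simple DDPM objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```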
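For the downstream JARVIS-DFT data, one plausible access path is the jarvis-tools package (`pip install jarvis-tools`), whose `jarvis.db.figshare.data` helper downloads dataset snapshots. Note that this fetches a current snapshot by default; reproducing the paper's pinned 2021.8.18 version may require locating that specific figshare release.

```python
# Sketch of loading JARVIS-DFT via jarvis-tools; the exact snapshot used
# in the paper (2021.8.18) may differ from the default download.
from jarvis.db.figshare import data

dft_3d = data("dft_3d")   # list of dicts, one per crystal entry
sample = dft_3d[0]
print(sample.keys())      # property fields, e.g. formation energy, band gap
```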
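The 80/10/10 split reported above could be realized as below. The paper does not state whether the split is random or stratified, nor does it give a seed, so a seeded random shuffle is an assumption here.

```python
# Hedged sketch of an 80/10/10 train/val/test split; random shuffle with a
# fixed seed is assumed, since the paper does not specify the protocol.
import random

def split_80_10_10(items, seed=42):
    """Shuffle a list of samples and split it into train/val/test (80/10/10)."""
    rng = random.Random(seed)
    idx = list(range(len(items)))
    rng.shuffle(idx)
    n_train = int(0.8 * len(items))
    n_val = int(0.1 * len(items))
    train = [items[i] for i in idx[:n_train]]
    val = [items[i] for i in idx[n_train:n_train + n_val]]
    test = [items[i] for i in idx[n_train + n_val:]]
    return train, val, test
```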