Space Group Constrained Crystal Generation

Authors: Rui Jiao, Wenbing Huang, Yu Liu, Deli Zhao, Yang Liu

ICLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experiments on several popular datasets verify the benefit of the involvement of the space group constraint."
Researcher Affiliation | Collaboration | ¹Dept. of Comp. Sci. & Tech., Institute for AI, Tsinghua University; ²Institute for AIR, Tsinghua University; ³Gaoling School of Artificial Intelligence, Renmin University of China; ⁴Beijing Key Laboratory of Big Data Management and Analysis Methods, Beijing, China; ⁵Alibaba Group
Pseudocode | No | The paper includes architectural diagrams (Figure 4) and mathematical formulations, but no structured pseudocode or labeled algorithm blocks.
Open Source Code | Yes | "Our code is available at https://github.com/jiaor17/DiffCSP-PP."
Open Datasets | Yes | "We evaluate our method on four datasets with different data distributions. Perov-5 (Castelli et al., 2012) encompasses 18,928 perovskite crystals... Carbon-24 (Pickard, 2020) comprises 10,153 carbon crystals... MP-20 (Jain et al., 2013) contains 45,231 materials... MPTS-52 serves as a more challenging extension of MP-20..."
Dataset Splits | Yes | "For Perov-5, Carbon-24 and MP-20, we follow the 60-20-20 split with previous works (Xie et al., 2021). For MPTS-52, we perform a chronological split, allocating 27,380/5,000/8,096 crystals for training/validation/testing." (A minimal split sketch appears after this table.)
Hardware Specification | No | The paper describes training settings and hyperparameters but does not specify the hardware used for the experiments (e.g., GPU or CPU models, memory).
Software Dependencies | No | The paper mentions using 'pymatgen' and 'XenonPy', and describes the implementation in terms of layers and hidden states, but does not give version numbers for software dependencies such as Python, PyTorch, or other libraries.
Experiment Setup | Yes | "We train a denoising model with 6 layers, 512 hidden states, and 128 Fourier embeddings for each task, and the training epochs are set to 3500, 4000, 1000, 1000 for Perov-5, Carbon-24, MP-20, and MPTS-52. The diffusion step is set to T = 1000. We utilize the cosine scheduler with s = 0.008 to control the variance of the DDPM process on k and A, and an exponential scheduler with σ_1 = 0.005, σ_T = 0.5 to control the noise scale on F. The loss coefficients are set as λ_k = λ_F = 1, λ_A = 20. We apply γ = 2×10⁻⁵ for Carbon-24, 1×10⁻⁵ for MPTS-52, and 5×10⁻⁶ for other datasets for the corrector steps during generation." (Both noise schedules are sketched in code after this table.)
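
The reported splits are straightforward to reconstruct. Below is a minimal Python sketch, assuming the crystal records are loaded into a pandas DataFrame and that the MPTS-52 records carry a hypothetical `date` column for the chronological ordering; the column name, the random seed, and the file layout are assumptions, not details taken from the paper or the released code.

```python
import pandas as pd

def random_split(df: pd.DataFrame, seed: int = 42):
    """60-20-20 random split, as used for Perov-5, Carbon-24, and MP-20."""
    shuffled = df.sample(frac=1.0, random_state=seed).reset_index(drop=True)
    n_train = int(0.6 * len(shuffled))
    n_val = int(0.2 * len(shuffled))
    train = shuffled.iloc[:n_train]
    val = shuffled.iloc[n_train:n_train + n_val]
    test = shuffled.iloc[n_train + n_val:]
    return train, val, test

def chronological_split(df: pd.DataFrame, date_col: str = "date"):
    """Chronological split for MPTS-52: 27,380 / 5,000 / 8,096 crystals."""
    ordered = df.sort_values(date_col).reset_index(drop=True)
    return (ordered.iloc[:27380],        # training
            ordered.iloc[27380:32380],   # validation (5,000)
            ordered.iloc[32380:40476])   # testing (8,096)
```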
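
Both noise schedules named in the setup are standard constructions; the sketch below shows one way to realize them, following the cosine variance schedule of Nichol & Dhariwal (2021) with s = 0.008 for the DDPM on k and A, and a geometric (exponential) interpolation from σ_1 = 0.005 to σ_T = 0.5 for the noise scales on F. The function names and the NumPy implementation are mine, not taken from the DiffCSP-PP repository.

```python
import numpy as np

T = 1000  # number of diffusion steps

def cosine_beta_schedule(T: int, s: float = 0.008, max_beta: float = 0.999):
    """Cosine schedule: betas derived from a squared-cosine cumulative-alpha
    curve, clipped for numerical stability near t = T."""
    steps = np.arange(T + 1)
    f = np.cos(((steps / T) + s) / (1 + s) * np.pi / 2) ** 2
    alphas_cumprod = f / f[0]
    betas = 1.0 - alphas_cumprod[1:] / alphas_cumprod[:-1]
    return np.clip(betas, 0.0, max_beta)

def exponential_sigma_schedule(T: int, sigma_1: float = 0.005,
                               sigma_T: float = 0.5):
    """Geometric interpolation of noise scales from sigma_1 (t=1) to sigma_T (t=T)."""
    t = np.arange(1, T + 1)
    return sigma_1 * (sigma_T / sigma_1) ** ((t - 1) / (T - 1))

betas = cosine_beta_schedule(T)          # variance schedule for k and A
sigmas = exponential_sigma_schedule(T)   # noise scales for F
```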