Density of States Prediction of Crystalline Materials via Prompt-guided Multi-Modal Transformer

Authors: Namkyeong Lee, Heewoong Noh, Sungwon Kim, Dongmin Hyun, Gyoung S. Na, Chanyoung Park

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments on two types of DOS, i.e., Phonon DOS and Electron DOS, with various real-world scenarios demonstrate the superiority of DOSTransformer."
Researcher Affiliation | Collaboration | Namkyeong Lee [1], Heewoong Noh [1], Sungwon Kim [1], Dongmin Hyun [2], Gyoung S. Na [3], Chanyoung Park [1]; [1] KAIST, [2] Yahoo Research, [3] KRICT. Emails: {namkyeong96,heewoongnoh,swkim,cy.park}@kaist.ac.kr, dhyun@yahooinc.com, ngs0@krict.re.kr
Pseudocode | Yes | "Algorithm 1 shows the pseudocode of DOSTransformer." (Appendix G, "Pseudo Code"; the algorithm is captioned "Algorithm 1: Pseudocode of DOSTransformer.")
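Algorithm 1 itself is not reproduced in this report. Purely as an illustration of the prompt-guided, cross-attention structure the title and quotes describe (learnable energy embeddings attending to GNN atom embeddings), a minimal PyTorch sketch follows; the module layout, shapes, and energy-grid size are assumptions, not the authors' Algorithm 1.

```python
# Minimal sketch of a cross-attention block between energy tokens and atom
# embeddings, loosely following the structure described for DOSTransformer.
# All names, shapes, and the simplified flow are illustrative assumptions,
# NOT the paper's Algorithm 1.
import torch
import torch.nn as nn

class EnergyCrossAttention(nn.Module):
    def __init__(self, d=128, n_heads=4, n_energies=201):
        super().__init__()
        # Learnable embeddings for the discretized energy grid (size assumed).
        self.energy_emb = nn.Parameter(torch.randn(n_energies, d))
        self.cross_attn = nn.MultiheadAttention(d, n_heads, batch_first=True)
        self.head = nn.Linear(d, 1)  # per-energy DOS value

    def forward(self, atom_emb):
        # atom_emb: (batch, n_atoms, d), e.g. the output of a GNN encoder.
        b = atom_emb.size(0)
        queries = self.energy_emb.unsqueeze(0).expand(b, -1, -1)
        # Energy tokens attend to the atom representations of the crystal.
        out, _ = self.cross_attn(queries, atom_emb, atom_emb)
        return self.head(out).squeeze(-1)  # (batch, n_energies)

# Example usage with random stand-ins for GNN atom embeddings:
atoms = torch.randn(2, 10, 128)
print(EnergyCrossAttention()(atoms).shape)  # torch.Size([2, 201])
```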
Open Source Code | Yes | "The source code for DOSTransformer is available at https://github.com/HeewoongNoh/DOSTransformer."
Open Datasets | Yes | "We use the Phonon DOS dataset following the instructions of the official GitHub repository [3] of a previous work [9]. For the Electron DOS dataset, we collect crystalline materials and their electron DOS data from the Materials Project (MP) website [4]. ..." Footnotes: [3] https://github.com/zhantaochen/phonondos_e3nn; [4] https://materialsproject.org/
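For the Electron DOS data, one common way to query the Materials Project programmatically is pymatgen's (legacy) `MPRester` client. The sketch below is an assumption about tooling, since the paper only cites the MP website; the API key and material id are placeholders.

```python
# Hedged sketch: fetching electron DOS from the Materials Project via
# pymatgen's legacy MPRester. The client choice, material id, and API key
# are placeholders; the paper does not describe its exact query.
from pymatgen.ext.matproj import MPRester

with MPRester("YOUR_API_KEY") as mpr:
    dos = mpr.get_dos_by_material_id("mp-149")  # e.g., silicon
    energies = dos.energies                     # energy grid (eV)
    total_dos = dos.get_densities()             # total DOS on that grid
```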
Dataset Splits | Yes | "For the in-distribution setting, we randomly split the dataset into train/valid/test of 80/10/10%. On the other hand, for the out-of-distribution setting, we split the dataset regarding the structure of the crystals. For both scenarios, we generate training sets with simple crystal structures and valid/test sets with more complex crystal structures."
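The in-distribution 80/10/10 random split is straightforward to reproduce; a sketch follows, where the scikit-learn helper and fixed seed are assumptions (the out-of-distribution split additionally requires crystal-structure metadata not shown here).

```python
# Sketch of the in-distribution 80/10/10 random split. scikit-learn and
# the fixed seed are assumptions; the paper does not name its tooling.
from sklearn.model_selection import train_test_split

dataset = list(range(1000))  # placeholder for the list of crystals
train, rest = train_test_split(dataset, test_size=0.2, random_state=42)
valid, test = train_test_split(rest, test_size=0.5, random_state=42)
print(len(train), len(valid), len(test))  # 800 100 100
```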
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used to run the experiments; it only discusses training and inference times in general terms.
Software Dependencies | No | The paper mentions using the `scipy` library for smoothing DOS values and the `e3nn` Python library for the E3NN implementation, but does not specify version numbers for these or for other components such as PyTorch or Python itself.
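Since `scipy` is credited for smoothing DOS values without a stated function or width, here is a minimal, hedged sketch of one common choice, Gaussian smoothing via `scipy.ndimage.gaussian_filter1d`; both the function and the `sigma` value are assumptions, not the paper's stated procedure.

```python
# Hedged sketch of DOS smoothing with scipy. The paper credits scipy for
# smoothing but names no function or width, so gaussian_filter1d and
# sigma=3 are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter1d

raw_dos = np.random.rand(201)                  # placeholder DOS on an energy grid
smoothed_dos = gaussian_filter1d(raw_dos, sigma=3)
```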
Experiment Setup | Yes | "Detailed hyperparameter specifications are given in Table 7. For the hyperparameters in DOSTransformer, we tune them in certain ranges as follows: number of message passing layers in GNN L in {2, 3, 4}, number of cross-attention layers L1, L3 in {2, 3, 4}, number of self-attention layers L2 in {2, 3, 4}, hidden dimension d in {64, 128, 256}, learning rate η in {0.0001, 0.0005, 0.001}, and batch size B in {1, 4, 8}."
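The quoted search space spans six hyperparameter axes with three options each, i.e., 3^6 = 729 candidate configurations. A minimal sketch enumerating that grid is shown below; exhaustive iteration is an assumption, since the paper does not state its search strategy.

```python
# Sketch of the reported tuning grid for DOSTransformer. Exhaustive
# enumeration is an assumption; the paper only lists the ranges.
from itertools import product

grid = {
    "gnn_layers": [2, 3, 4],         # L: message passing layers
    "cross_attn_layers": [2, 3, 4],  # L1, L3: cross-attention layers
    "self_attn_layers": [2, 3, 4],   # L2: self-attention layers
    "hidden_dim": [64, 128, 256],    # d
    "lr": [1e-4, 5e-4, 1e-3],        # learning rate (eta)
    "batch_size": [1, 4, 8],         # B
}

configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 729 candidate configurations
```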