Graph Diffusion Transformers for Multi-Conditional Molecular Generation
Authors: Gang Liu, Jiaxin Xu, Tengfei Luo, Meng Jiang
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We extensively validate Graph DiT for multi-conditional polymer and small molecule generation. Results demonstrate the superiority of Graph DiT across nine metrics from distribution learning to condition control for molecular properties. In experiments, we evaluate model performance on one polymer and three small molecule datasets. |
| Researcher Affiliation | Academia | Gang Liu, Jiaxin Xu, Tengfei Luo, Meng Jiang University of Notre Dame {gliu7, jxu24, tluo, mjiang2}@nd.edu |
| Pseudocode | No | Not found. The paper contains architectural diagrams and descriptions but no explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is provided in the supplementary materials. Data and code will be on Github after publication. |
| Open Datasets | Yes | We have one polymer dataset [40] for materials, featuring three numerical gas permeability conditions: O2Perm, CO2Perm, and N2Perm. For drug design, we create three class-balanced datasets from MoleculeNet [46]: HIV, BBBP, and BACE. |
| Dataset Splits | Yes | We randomly split the dataset into training, validation, and testing (reference) sets in a 6:2:2 ratio. |
| Hardware Specification | Yes | All experiments can be run on a single A6000 GPU card. |
| Software Dependencies | No | Not found. The paper does not specify software dependencies with version numbers for reproducibility (e.g., Python, PyTorch, or other libraries with their specific versions). |
| Experiment Setup | No | Not found. The paper describes architectural and encoding choices but does not provide specific numerical hyperparameters (e.g., learning rate, batch size, number of epochs) or system-level training settings. |