Torsional Diffusion for Molecular Conformer Generation

Authors: Bowen Jing, Gabriele Corso, Jeffrey Chang, Regina Barzilay, Tommi Jaakkola

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We evaluate torsional diffusion by comparing the generated and ground-truth conformers in terms of ensemble RMSD (Section 4.3) and properties (Section 4.4)."
Researcher Affiliation | Academia | "1 CSAIL, Massachusetts Institute of Technology; 2 Dept. of Physics, Harvard University"
Pseudocode | Yes | "Algorithm 1: Energy-based training epoch"
Open Source Code | Yes | "Code is available at https://github.com/gcorso/torsional-diffusion."
Open Datasets | Yes | "Dataset: We evaluate on the GEOM dataset [Axelrod and Gómez-Bombarelli, 2022], which provides gold-standard conformer ensembles generated with metadynamics in CREST [Pracht et al., 2020]." and "We used the code and data released by GeoMol and GeoDiff released under MIT license and the GEOM datasets released under CC0 1.0 license."
Dataset Splits | Yes | "We use the train/val/test splits from Ganea et al. [2021] and use the same metrics to compare the generated and ground truth conformer ensembles." (A sketch of these ensemble RMSD metrics appears after the table.)
Hardware Specification | No | "Approximately 2000 GPU-hours on an internal cluster." (No specific GPU models or other hardware configuration are reported.)
Software Dependencies | No | The paper mentions several software tools and packages (e.g., RDKit, CREST, GFN2-xTB) but does not provide specific version numbers for the software dependencies of its own implementation.
Experiment Setup | Yes | "Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? [Yes] See Appendix G."
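The ensemble RMSD evaluation referenced in the Research Type and Dataset Splits rows compares a generated conformer ensemble against the ground-truth ensemble using coverage (COV) and average minimum RMSD (AMR) scores, each in a recall and a precision direction. Below is a minimal sketch, not the authors' evaluation code, assuming both ensembles are given as RDKit molecules with one embedded conformer each, that RMSDs are computed over heavy atoms with rdMolAlign.GetBestRMS, and that the 0.75 Å threshold (a common choice for GEOM-DRUGS) applies; all of these specifics are assumptions rather than details quoted from the paper.

    # Hedged sketch of ensemble RMSD metrics (COV/AMR, recall and precision).
    # Assumes lists of single-conformer RDKit molecules; not the paper's script.
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import rdMolAlign

    def rmsd_matrix(generated, reference):
        """Pairwise best-alignment RMSD between two ensembles, heavy atoms only."""
        gen = [Chem.RemoveHs(m) for m in generated]
        ref = [Chem.RemoveHs(m) for m in reference]
        return np.array([[rdMolAlign.GetBestRMS(g, r) for r in ref] for g in gen])

    def ensemble_metrics(generated, reference, delta=0.75):
        """COV/AMR recall and precision for one molecule at RMSD threshold delta (Å)."""
        D = rmsd_matrix(generated, reference)   # shape: (n_generated, n_reference)
        min_over_gen = D.min(axis=0)            # best generated match per reference conformer
        min_over_ref = D.min(axis=1)            # best reference match per generated conformer
        return {
            "COV-R": float((min_over_gen < delta).mean()),  # recall coverage
            "AMR-R": float(min_over_gen.mean()),            # recall avg. minimum RMSD
            "COV-P": float((min_over_ref < delta).mean()),  # precision coverage
            "AMR-P": float(min_over_ref.mean()),            # precision avg. minimum RMSD
        }

Per-molecule scores of this kind are then typically aggregated (e.g., mean and median over the test-set molecules) when reporting benchmark tables such as those in Section 4.3.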