Equivariant Diffusion for Molecule Generation in 3D

Authors: Emiel Hoogeboom, Víctor Garcia Satorras, Clément Vignac, Max Welling

ICML 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimentally, the proposed method significantly outperforms previous 3D molecular generative methods regarding the quality of generated samples and efficiency at training time."
Researcher Affiliation | Academia | "UvA-Bosch Delta Lab, University of Amsterdam, Netherlands; EPFL, Lausanne, Switzerland."
Pseudocode | Yes | "Algorithm 1 Optimizing EDM"
Open Source Code | No | The paper does not contain an explicit statement or link indicating the release of source code for the described methodology.
Open Datasets | Yes | "QM9 (Ramakrishnan et al., 2014) is a standard dataset that contains molecular properties and atom coordinates for 130k small molecules with up to 9 heavy atoms (29 atoms including hydrogens)."
Dataset Splits | Yes | "We use the train/val/test partitions introduced in (Anderson et al., 2019), which consists of 100K/18K/13K samples respectively for each partition."
Hardware Specification | Yes | "Training takes approximately 7 days on a single NVIDIA GeForce GTX 1080Ti GPU."
Software Dependencies | No | The paper mentions software such as 'Adam' (the optimizer) and concepts such as 'fully connected neural networks', 'MLPs', and 'EGNN', but does not provide specific version numbers for any software dependencies.
Experiment Setup | Yes | "All models use 9 layers, 256 features per layer and SiLU activations. They are trained using Adam with batch size 64 and learning rate 10^-4."
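The "Pseudocode" row refers to Algorithm 1 (Optimizing EDM), a denoising-diffusion training step: sample a timestep, noise the input, and regress the noise. The sketch below is a simplified pure-Python illustration of that generic step, not the paper's implementation — it omits the E(3)-equivariant network, the zero-center-of-gravity subspace projection, and the paper's actual noise schedule; `alpha_sigma` and `eps_model` are hypothetical stand-ins.

```python
import math
import random

def alpha_sigma(t):
    # Illustrative variance-preserving schedule (alpha_t^2 + sigma_t^2 = 1).
    # The paper uses its own schedule; this is just a placeholder.
    return math.cos(0.5 * math.pi * t), math.sin(0.5 * math.pi * t)

def edm_training_loss(x, eps_model, rng=random):
    """One simplified training step in the spirit of Algorithm 1.

    x         : list of floats (stand-in for atom coordinates/features)
    eps_model : callable (z_t, t) -> predicted noise, same length as x
    Returns the squared error ||eps - eps_hat||^2 for this sample.
    """
    t = rng.random()                                   # t ~ U(0, 1)
    alpha, sigma = alpha_sigma(t)
    eps = [rng.gauss(0.0, 1.0) for _ in x]             # eps ~ N(0, I)
    z_t = [alpha * xi + sigma * ei for xi, ei in zip(x, eps)]  # noised input
    eps_hat = eps_model(z_t, t)                        # network prediction
    return sum((e - eh) ** 2 for e, eh in zip(eps, eps_hat))
```

In the paper this loss is minimized with Adam (batch size 64, learning rate 10^-4) over a 9-layer, 256-feature EGNN; the dummy `eps_model` here only stands in for that network.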