Pocket2Mol: Efficient Molecular Sampling Based on 3D Protein Pockets
Authors: Xingang Peng, Shitong Luo, Jiaqi Guan, Qi Xie, Jian Peng, Jianzhu Ma
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results demonstrate that molecules sampled from Pocket2Mol achieve significantly better binding affinity and other drug properties such as drug-likeness and synthetic accessibility. |
| Researcher Affiliation | Collaboration | 1Tsinghua University, Beijing, China 2Helixon Limited, Beijing, China 3Department of Computer Science, University of Illinois at Urbana-Champaign, Champaign, USA 4Westlake University, Hangzhou, China 5AIR, Tsinghua University, Beijing, China 6Institute for Artificial Intelligence, Peking University, Beijing, China 7Beijing Institute for General Artificial Intelligence, Beijing, China. |
| Pseudocode | No | The paper describes the generation procedure and model architecture in detail, including figures, but does not provide formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | The codes are available at https://github.com/pengxingang/Pocket2Mol. |
| Open Datasets | Yes | To evaluate the generation performance of Pocket2Mol, we use the CrossDocked dataset (Francoeur et al., 2020), which contains 22.5 million protein-molecule structures, and follow the same data preparation and data splitting as (Luo et al., 2021) and (Masuda et al., 2020). |
| Dataset Splits | Yes | To evaluate the generation performance of Pocket2Mol, we use the CrossDocked dataset (Francoeur et al., 2020), which contains 22.5 million protein-molecule structures, and follow the same data preparation and data splitting as (Luo et al., 2021) and (Masuda et al., 2020). We validated the model every 5000 training iterations and the number of total training iterations is 475,000. |
| Hardware Specification | Yes | All the experiments are conducted on Ubuntu Linux with V100 GPUs. |
| Software Dependencies | Yes | The codes are implemented in Python 3.8 mainly with PyTorch 1.9.0 and our codes are uploaded as Supplementary Material. |
| Experiment Setup | Yes | We trained Pocket2Mol with batch size 8 and initial learning rate 2 × 10⁻⁴, and decayed the learning rate by a factor of 0.6 if the validation loss did not decrease for 8 validation iterations. We validated the model every 5000 training iterations and the number of total training iterations is 475,000. |
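
The training setup reported above maps directly onto standard PyTorch components. Below is a minimal, runnable sketch of that schedule: the learning rate (2 × 10⁻⁴), decay factor (0.6), patience (8 validation rounds), validation interval (5000 iterations), total iterations (475,000), and batch size (8) follow the paper's reported values, while the optimizer choice (Adam), model, data, and loss are placeholder assumptions; the actual training code is in the authors' repository (https://github.com/pengxingang/Pocket2Mol).

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Toy stand-ins so the schedule logic is runnable end to end; the real
# model, data, and losses live in the authors' repo.
model = nn.Linear(16, 1)                      # placeholder network
make_batch = lambda: (torch.randn(8, 16),     # batch size 8, per the paper
                      torch.randn(8, 1))
criterion = nn.MSELoss()                      # placeholder loss

# Reported setup: initial learning rate 2e-4 (Adam itself is an assumption)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)

# Decay the LR by a factor of 0.6 when the validation loss fails to
# improve for 8 consecutive validation rounds, matching the paper
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.6, patience=8)

VALIDATE_EVERY = 5_000    # validate every 5000 training iterations
TOTAL_ITERS = 475_000     # total training iterations

for it in range(1, TOTAL_ITERS + 1):
    x, y = make_batch()
    loss = criterion(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if it % VALIDATE_EVERY == 0:
        with torch.no_grad():
            vx, vy = make_batch()                   # placeholder validation data
            val_loss = criterion(model(vx), vy).item()
        scheduler.step(val_loss)  # triggers the 0.6x decay on plateau
```

Note that `scheduler.step(val_loss)` is called once per validation round rather than per training iteration, so the scheduler's `patience=8` counts validation iterations, consistent with the paper's "8 validation iterations" phrasing.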