DrugCLIP: Contrastive Protein-Molecule Representation Learning for Virtual Screening
Authors: Bowen Gao, Bo Qiang, Haichuan Tan, Yinjun Jia, Minsi Ren, Minsi Lu, Jingjing Liu, Wei-Ying Ma, Yanyan Lan
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments show that DrugCLIP significantly outperforms traditional docking and supervised learning methods on diverse virtual screening benchmarks with highly reduced computation time, especially in the zero-shot setting. |
| Researcher Affiliation | Collaboration | Bowen Gao (1), Bo Qiang (2), Haichuan Tan (1), Minsi Ren (3), Yinjun Jia (4), Minsi Lu (5), Jingjing Liu (1), Wei-Ying Ma (1), Yanyan Lan (1,6). Affiliations: (1) Institute for AI Industry Research (AIR), Tsinghua University; (2) Department of Pharmaceutical Science, Peking University; (3) Institute of Automation, Chinese Academy of Sciences; (4) School of Life Sciences, Tsinghua University; (5) Department of Pharmaceutical Science, Tsinghua University; (6) Beijing Academy of Artificial Intelligence |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code for DrugCLIP is available at https://github.com/bowen-gao/DrugCLIP. |
| Open Datasets | Yes | We use three datasets for training: PDBBind [45], BioLiP [47], and ChEMBL [8]. |
| Dataset Splits | Yes | We use the general set for training and the refined set for validation. |
| Hardware Specification | Yes | We have a batch size of 192, and we use 4 NVIDIA A100 GPU cards for acceleration. |
| Software Dependencies | No | The paper mentions software like RDKit and Uni-Mol, but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | We train our model using the Adam optimizer with a learning rate of 0.001. The other hyper-parameters are set to their default values. We have a batch size of 192, and we use 4 NVIDIA A100 GPU cards for acceleration. We train our model for a maximum of 200 epochs. |
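DrugCLIP's core idea, per the title and abstract, is CLIP-style contrastive alignment of protein-pocket and molecule embeddings. Below is a minimal PyTorch sketch of such a symmetric InfoNCE objective; the temperature value and the `(batch, dim)` embedding interface are illustrative assumptions, not details confirmed by the paper, which uses Uni-Mol-based encoders.

```python
# Sketch of a CLIP-style symmetric contrastive (InfoNCE) loss between
# protein-pocket and molecule embeddings. Hyperparameters and the embedding
# interface are assumptions for illustration, not the paper's exact setup.
import torch
import torch.nn.functional as F

def clip_style_loss(pocket_emb: torch.Tensor,
                    mol_emb: torch.Tensor,
                    temperature: float = 0.07) -> torch.Tensor:
    """pocket_emb, mol_emb: (batch, dim) embeddings of paired pockets and molecules."""
    pocket_emb = F.normalize(pocket_emb, dim=-1)
    mol_emb = F.normalize(mol_emb, dim=-1)
    # (batch, batch) cosine-similarity logits; diagonal entries are true pairs.
    logits = pocket_emb @ mol_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric cross-entropy: match each pocket to its molecule and vice versa.
    loss_p2m = F.cross_entropy(logits, targets)
    loss_m2p = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_p2m + loss_m2p)
```

At screening time, this formulation lets molecule embeddings be precomputed once and ranked against a pocket embedding by dense similarity, which is where the reported speedup over docking comes from.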
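The Experiment Setup row reports Adam with a learning rate of 0.001, batch size 192, and up to 200 epochs. A minimal training loop wiring those reported values around the loss above might look as follows; the dataset object and the two encoders are hypothetical stand-ins for the paper's data pipeline and Uni-Mol-based encoders, and the 4-GPU acceleration is omitted.

```python
# Training-loop sketch using the hyperparameters reported in the paper:
# Adam, lr = 0.001, batch size 192, up to 200 epochs. `dataset`,
# `pocket_encoder`, and `mol_encoder` are hypothetical stand-ins.
import torch
from torch.utils.data import DataLoader

def train(pocket_encoder, mol_encoder, dataset,
          epochs: int = 200, batch_size: int = 192, lr: float = 1e-3) -> None:
    params = list(pocket_encoder.parameters()) + list(mol_encoder.parameters())
    # Other Adam hyperparameters are left at their defaults, as in the paper.
    optimizer = torch.optim.Adam(params, lr=lr)
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    for epoch in range(epochs):
        for pockets, molecules in loader:  # assumed (pocket, molecule) pairs
            loss = clip_style_loss(pocket_encoder(pockets), mol_encoder(molecules))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```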