Towards Stable Representations for Protein Interface Prediction
Authors: Ziqi Gao, Zijing Liu, Yu Li, Jia Li
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on various benchmarks demonstrate that ATProt consistently improves the performance for protein interface prediction. Moreover, our method demonstrates broad applicability, performing the best even when provided with testing structures from structure prediction models like ESMFold and AlphaFold2. |
| Researcher Affiliation | Collaboration | Ziqi Gao (1,2), Zijing Liu (3)*, Yu Li (3), Jia Li (1,2)*. Affiliations: (1) Hong Kong University of Science and Technology; (2) Hong Kong University of Science and Technology (Guangzhou); (3) International Digital Economy Academy (IDEA). |
| Pseudocode | No | The paper presents mathematical formulations and describes the framework, but it does not include any explicitly labeled 'Pseudocode' or 'Algorithm' blocks with structured steps. |
| Open Source Code | Yes | The code is provided in https://github.com/ATProt/ATProt. |
| Open Datasets | Yes | We evaluate our method on the complexes from Docking Benchmark 5.5 (DB5.5) [45], a gold-standard dataset of high-quality complexes, and the Database of Interacting Protein Structures (DIPS) [43], which collects 41,876 complexes mined from the PDB [4]. |
| Dataset Splits | Yes | The two datasets are randomly divided into training, validation, and testing sets with the following sizes: 203/25/25 (DB5.5) and 39,937/974/965 (DIPS). A minimal split sketch is given after the table. |
| Hardware Specification | Yes | The training process takes around 0.5 hours with 1 Nvidia 4090 GPU with 24GB RAM. |
| Software Dependencies | No | The paper lists hyperparameters in Table 4 but does not specify software dependencies like programming language versions or library versions (e.g., Python 3.x, PyTorch 1.x) that are crucial for reproducibility. |
| Experiment Setup | Yes | The hyper-parameters used in this paper are listed in Table 4; for example, batch size 4, learning rate 3e-4, Adam optimizer, and dropout rate 0.2. A hedged training-loop sketch using these values is given after the table. |
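
For illustration, below is a minimal Python sketch of the random train/validation/test split the report quotes. The paper does not publish its splitting code, so the complex IDs, the helper `random_split`, and the seed are all assumptions here; only the split sizes (203/25/25 for DB5.5, 39,937/974/965 for DIPS) come from the report.

```python
# Hypothetical reconstruction of the random split quoted above.
# The ID list, the helper function, and the seed are assumptions,
# not taken from the paper or its repository.
import random

def random_split(complexes, n_train, n_val, n_test, seed=0):
    """Shuffle the complex IDs and slice them into three disjoint sets."""
    assert n_train + n_val + n_test == len(complexes)
    rng = random.Random(seed)
    shuffled = complexes[:]  # copy so the caller's list is left untouched
    rng.shuffle(shuffled)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test

# Example with DB5.5-sized placeholder IDs (253 complexes = 203 + 25 + 25).
db55_ids = [f"complex_{i}" for i in range(253)]
train, val, test = random_split(db55_ids, 203, 25, 25)
print(len(train), len(val), len(test))  # 203 25 25
```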
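
Similarly, the following sketch wires the quoted Table 4 hyper-parameters (batch size 4, learning rate 3e-4, Adam, dropout 0.2) into a generic PyTorch training loop. `ATProtModel`, the feature dimensions, the dummy data, and the loss function are hypothetical placeholders, not the authors' method; their actual implementation lives at https://github.com/ATProt/ATProt.

```python
# A minimal sketch, assuming a PyTorch setup, of a training loop using the
# hyper-parameters the report quotes from Table 4. The model below is a
# hypothetical stand-in, not the ATProt architecture from the paper.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class ATProtModel(nn.Module):  # placeholder model, dimensions assumed
    def __init__(self, in_dim=128, hidden=256, dropout=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),  # dropout rate 0.2 (Table 4)
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

# Dummy residue features and binary interface labels, only so the loop runs.
features = torch.randn(64, 128)
labels = torch.randint(0, 2, (64, 1)).float()
loader = DataLoader(TensorDataset(features, labels),
                    batch_size=4, shuffle=True)  # batch size 4 (Table 4)

model = ATProtModel()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)  # Adam, lr 3e-4
criterion = nn.BCEWithLogitsLoss()  # assumed loss; the paper's may differ

for x, y in loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```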