A Hierarchical Training Paradigm for Antibody Structure-sequence Co-design

Authors: Fang Wu, Stan Z. Li

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical experiments show that HTP sets the new state-of-the-art performance in the co-design problem as well as the fix-backbone design. Our research offers a hopeful path to unleash the potential of deep generative architectures and seeks to illuminate the way forward for the antibody sequence and structure co-design challenge." (Abstract; see also Section 3, Experiments.)
Researcher Affiliation | Academia | Fang Wu (Tsinghua University, Beijing, China) and Stan Z. Li (Westlake University, Hangzhou, China). Corresponding author email: fw2359@columbia.edu.
Pseudocode | No | The paper describes the model architecture and equations (e.g., Section 2.4, Geometric Graph Neural Networks) but does not include explicit pseudocode blocks or algorithm listings.
Open Source Code | Yes | "All relevant Python code to reproduce the results in our paper is stored in the GitHub repository at https://github.com/smiles724/HTP."
Open Datasets | Yes | "(2) For the antibody sequence level, we use the Observed Antibody Space database (OAS) [31] and its subsequent update [32] as the pretraining data. ... which can be downloaded from its official website at https://opig.stats.ox.ac.uk/webapps/oas/. ... (4) For the antibody-antigen complex structure level, we select all available antibody-antigen protein complexes from SAbDab [16] at https://opig.stats.ox.ac.uk/webapps/newsabdab/sabdab/ ..."
Dataset Splits | Yes | "It results in train/val/test of 87,303/31,050/15,268 complex samples." (for DIPS) and "The training and validation splits just include complexes not involved during the curation of the test split. After that, we randomly divided the remaining complexes with a ratio of 90% and 10% into training and validation sets." (for SAbDab; see the split sketch after this table).
Hardware Specification | Yes | "All experiments are run on multiple A100 GPUs, each with a memory storage of 80G." (A GPU-check sketch follows the table.)
Software Dependencies | No | The paper states "HTP is implemented in PyTorch and PyTorch Geometric packages" but does not provide specific version numbers for these software dependencies (a version-recording snippet follows the table).
Experiment Setup | Yes | "For all four training stages, we leverage an Adam optimizer [50] with a weight decay of 1e-5. ... The entire hyperparameter search space is depicted in Table 5." (See the optimizer sketch after this table.)
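
To make the quoted SAbDab split protocol concrete, here is a minimal sketch of the 90%/10% random division into training and validation sets. It assumes `complexes` is a list of complex identifiers from which test-set entries have already been removed; the function name and seed are hypothetical, not taken from the paper or its repository.

```python
import random

def split_train_val(complexes, val_fraction=0.1, seed=42):
    """Randomly divide the remaining complexes 90%/10% into
    training and validation sets, as described for SAbDab.
    `complexes` is assumed to already exclude test-set entries."""
    rng = random.Random(seed)
    shuffled = list(complexes)
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_fraction)
    return shuffled[n_val:], shuffled[:n_val]  # (train, val)
```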
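The hardware row reports multiple 80G A100 GPUs. One quick, generic way to confirm a comparable setup before attempting reproduction is to query PyTorch for the visible devices; this check is not code from the HTP repository.

```python
import torch

# Print each visible GPU and its total memory in GB, to compare
# against the multiple 80G A100 setup reported in the paper.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GB")
```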
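Because the paper names PyTorch and PyTorch Geometric without pinning versions, reproducers may want to record the versions they actually run. A minimal snippet for that is below; it is a generic suggestion, not part of the paper's code.

```python
import torch
import torch_geometric

# The paper does not pin dependency versions, so log the ones in use.
print("torch:", torch.__version__)
print("torch_geometric:", torch_geometric.__version__)
```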
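Finally, the experiment-setup row quotes an Adam optimizer with a weight decay of 1e-5. A minimal PyTorch sketch of that configuration follows; the stand-in module and learning rate are placeholders, since the paper tunes hyperparameters per training stage (its Table 5).

```python
import torch

# Adam with weight decay 1e-5, matching the quoted setting.
# The module and learning rate are placeholders; the paper
# searches hyperparameters per training stage (see its Table 5).
model = torch.nn.Linear(16, 16)  # stand-in for the HTP network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
```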