FAFE: Immune Complex Modeling with Geodesic Distance Loss on Noisy Group Frames
Authors: Ruidong Wu, Ruihan Guo, Rui Wang, Shitong Luo, Yue Xu, Jiahan Li, Jianzhu Ma, Qiang Liu, Yunan Luo, Jian Peng
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | By fine-tuning AF2 with our proposed new loss function, we attain a correct rate of 52.3% (DockQ > 0.23) on an evaluation set and 43.8% correct rate on a subset with low homology, with substantial improvement over AF2 by 182% and 100% respectively. |
| Researcher Affiliation | Collaboration | 1 Helixon, 2 Tsinghua, 3 UT Austin, 4 Georgia Tech. |
| Pseudocode | Yes | We provide an example implementation of F2E in PyTorch 2.0.1. (Appendix B.4 contains structured Python code snippets which serve as pseudocode for implementation.) |
| Open Source Code | Yes | Code is available at https://github.com/mooninrain/FAFE.git. |
| Open Datasets | Yes | The fine-tuning dataset is collected from the Structural Antibody Database (SabDab) (Dunbar et al., 2014), which provides annotations on the original PDB database (Burley et al., 2017) raw structures. |
| Dataset Splits | No | The paper mentions a "training set" and "evaluation set" but does not specify the exact percentages or counts for a training/validation/test split, or reference predefined splits for this partitioning. |
| Hardware Specification | No | The paper mentions "computing resource constraints" but does not specify any particular hardware components like CPU or GPU models used for the experiments. |
| Software Dependencies | Yes | We provide an example implementation of F2E in PyTorch 2.0.1. |
| Experiment Setup | Yes | The batch size is set to 32, which we find is the minimal batch size to improve AF2 performance during fine-tuning. ... The crop length is set to 384 at max. The number of MSA cluster centers is restricted to 128 for training efficiency. |