Conditional Graph Information Bottleneck for Molecular Relational Learning
Authors: Namkyeong Lee, Dongmin Hyun, Gyoung S. Na, Sungwon Kim, Junseok Lee, Chanyoung Park
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on various tasks with real-world datasets demonstrate the superiority of CGIB over state-of-the-art baselines. |
| Researcher Affiliation | Academia | 1KAIST 2POSTECH 3KRICT. Correspondence to: Chanyoung Park <cy.park@kaist.ac.kr>. |
| Pseudocode | No | The paper describes its methodology using text and mathematical equations in Section 4, but it does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/Namkyeong/CGIB. |
| Open Datasets | Yes | For the molecular interaction prediction task, we use Chromophore dataset (Joung et al., 2020), which is related to three optical properties of chromophores, as well as 5 other datasets, i.e., MNSol (Marenich et al., 2020), FreeSolv (Mobley & Guthrie, 2014), CompSol (Moine et al., 2017), Abraham (Grubbs et al., 2010), and CombiSolv (Vermeire & Green, 2021), which are related to the solvation free energy of solute. |
| Dataset Splits | Yes | For the molecular interaction prediction task, we evaluate the models under a 5-fold cross-validation scheme following the previous work (Pathak et al., 2020). The dataset is randomly split into 5 subsets, and one of the subsets is used as the test set while the remaining subsets are used to train the model. A subset of the test set is selected as the validation set for hyperparameter selection and early stopping. |
| Hardware Specification | Yes | We conduct all the experiments using a 24GB NVIDIA GeForce RTX 3090. |
| Software Dependencies | No | The paper mentions using specific models like MPNN, GCN, GIN, and Set2Set, and the Adam optimizer, but does not provide specific version numbers for the underlying software libraries or programming languages used for implementation (e.g., Python version, PyTorch/TensorFlow version). |
| Experiment Setup | Yes | Hyperparameter details are described in Appendix E. [...] Table 7: Hyperparameter specifications (d: embedding dim, K: batch size, lr: learning rate, β: beta, τ: temperature). [...] We use the Adam optimizer for model optimization. For molecular interaction task and drug-drug interaction task, the learning rate was decreased on plateau by a factor of 10⁻¹ with the patience of 20 epochs following previous work (Pathak et al., 2020). |
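The evaluation protocol quoted above combines 5-fold cross-validation with reduce-on-plateau learning-rate decay (factor 0.1, patience 20 epochs). A minimal sketch in plain Python, assuming only the description in the table; the helper names and the shape of the split generator are illustrative and do not come from the paper's released code:

```python
import random

def five_fold_splits(n_samples, seed=0):
    """Randomly split sample indices into 5 folds; each fold serves once
    as the test set while the remaining 4 folds form the training set.
    (Per the paper, a subset of the test fold is then held out as the
    validation set for hyperparameter selection and early stopping.)"""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::5] for i in range(5)]
    for k in range(5):
        test = folds[k]
        train = [i for j, fold in enumerate(folds) if j != k for i in fold]
        yield train, test

class PlateauDecay:
    """Minimal reduce-on-plateau rule: multiply the learning rate by
    `factor` once the validation loss has failed to improve for more
    than `patience` consecutive epochs (mirroring the paper's setting
    of factor=0.1, patience=20)."""
    def __init__(self, lr, factor=0.1, patience=20):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best, self.bad_epochs = float("inf"), 0

    def step(self, val_loss):
        if val_loss < self.best:          # improvement: reset the counter
            self.best, self.bad_epochs = val_loss, 0
        else:                             # no improvement this epoch
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor    # decay and start counting again
                self.bad_epochs = 0
        return self.lr
```

In a framework implementation the same behavior is typically delegated to a built-in scheduler (e.g. a reduce-on-plateau scheduler stepped with the validation loss each epoch); the class above only makes the decay rule explicit.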