EiG-Search: Generating Edge-Induced Subgraphs for GNN Explanation in Linear Time
Authors: Shengyao Lu, Bang Liu, Keith G Mills, Jiao He, Di Niu
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct extensive experiments on a total of seven datasets, demonstrating its superior performance and efficiency both quantitatively and qualitatively over the leading baselines. |
| Researcher Affiliation | Collaboration | 1Department of Electrical and Computer Engineering, University of Alberta 2DIRO, Université de Montréal & Mila, Canada CIFAR AI Chair 3Kirin AI Algorithm & Solution, Huawei. |
| Pseudocode | Yes | Algorithm 1 Linear-Complexity Search for Subgraph |
| Open Source Code | Yes | Our code is available at: https://github.com/sluxsr/EiG-Search. |
| Open Datasets | Yes | We conduct experiments both on the synthetic dataset BA-2Motifs (Luo et al., 2020), and the real-world datasets MUTAG (Debnath et al., 1991), Mutagenicity (Kazius et al., 2005), NCI1 (Wale & Karypis, 2006). |
| Dataset Splits | Yes | The GNNs are trained with the following data splits: training set (80%), validation set (10%), testing set (10%). |
| Hardware Specification | Yes | All the experiments are conducted on Intel Core i7-10700 Processor and NVIDIA GeForce RTX 3090 Graphics Card. |
| Software Dependencies | No | The paper mentions using GNN models like GCN and GIN, but does not specify any software libraries (e.g., PyTorch, TensorFlow) or their exact version numbers used in the experiments. |
| Experiment Setup | Yes | All the GNNs contain 3 message-passing layers and a 2-layer classifier, the hidden dimension is 32 for BA-2Motifs, BA-Shapes, and 64 for BA-Community, Tree-grid, MUTAG, Mutagenicity and NCI1. |
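The "Dataset Splits" and "Experiment Setup" rows above describe the GNN configuration used in the paper: 3 message-passing layers, a 2-layer classifier, hidden dimension 32 or 64 depending on the dataset, and an 80/10/10 train/validation/test split. The following is a minimal sketch of such a setup, assuming PyTorch Geometric and a GCN backbone; as noted in the "Software Dependencies" row, the paper does not state its software stack, so the library choice and helper names here are assumptions for illustration only.

```python
# Hypothetical sketch of the reported setup: 3 message-passing layers,
# a 2-layer classifier head, and an 80/10/10 data split.
# PyTorch Geometric is an assumption; the paper does not specify its libraries.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        # Three message-passing layers, as stated in the experiment setup.
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.conv3 = GCNConv(hidden_dim, hidden_dim)
        # Two-layer classifier head.
        self.lin1 = torch.nn.Linear(hidden_dim, hidden_dim)
        self.lin2 = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = F.relu(self.conv3(x, edge_index))
        x = global_mean_pool(x, batch)  # graph-level readout for graph classification
        x = F.relu(self.lin1(x))
        return self.lin2(x)


def split_dataset(dataset):
    """80% / 10% / 10% random split, matching the reported data splits."""
    n = len(dataset)
    n_train, n_val = int(0.8 * n), int(0.1 * n)
    perm = torch.randperm(n)
    train = dataset[perm[:n_train]]
    val = dataset[perm[n_train:n_train + n_val]]
    test = dataset[perm[n_train + n_val:]]
    return train, val, test
```

Per the setup row, a hidden dimension of 32 would be passed for BA-2Motifs and BA-Shapes, and 64 for BA-Community, Tree-Grid, MUTAG, Mutagenicity, and NCI1.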