Task-Agnostic Graph Explanations

Authors: Yaochen Xie, Sumeet Katariya, Xianfeng Tang, Edward Huang, Nikhil Rao, Karthik Subbian, Shuiwang Ji

NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our extensive experiments show that TAGE can significantly speed up the explanation efficiency by using the same model to explain predictions for multiple downstream tasks while achieving explanation quality as good as or even better than current state-of-the-art GNN explanation approaches.
Researcher Affiliation | Collaboration | Yaochen Xie (Texas A&M University, College Station, TX; ethanycx@tamu.edu), Sumeet Katariya (Amazon Search, Palo Alto, CA; katsumee@amazon.com), Xianfeng Tang (Amazon Search, Palo Alto, CA; xianft@amazon.com), Edward Huang (Amazon Search, Palo Alto, CA; ewhuang@amazon.com), Nikhil Rao (Amazon Search, Palo Alto, CA; nikhilsr@amazon.com), Karthik Subbian (Amazon Search, Palo Alto, CA; ksubbian@amazon.com), Shuiwang Ji (Texas A&M University, College Station, TX; sji@tamu.edu)
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | Yes | Our code is publicly available as part of the DIG library: https://github.com/divelab/DIG/tree/main/dig/xgraph/TAGE/
Open Datasets | Yes | The MoleculeNet [29] library provides a collection of molecular graph datasets for the prediction of different molecule properties. The Protein-Protein Interaction (PPI) [39] dataset documents the physical interactions between proteins in 24 different human tissues. (See the dataset-loading sketch after the table.)
Dataset Splits | Yes | For each real-world dataset, we evaluate explainers on multiple downstream tasks that share a single embedding model... More implementation details are provided in Appendix B.
Hardware Specification | Yes | All results are obtained by running the explanation on the PPI dataset with 121 node classification tasks, using a single Nvidia Tesla V100 GPU.
Software Dependencies | No | The paper does not state version numbers for its software dependencies (e.g., Python, PyTorch) that would be needed for reproduction, beyond a general citation to PyTorch.
Experiment Setup | Yes | More implementation details are provided in Appendix B.
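
The two dataset families flagged as open above are available through standard graph-learning toolkits. The sketch below is not taken from the paper's code; it assumes PyTorch Geometric's built-in PPI and MoleculeNet loaders purely for illustration, and the task name "BACE" and the data/ paths are hypothetical choices. The authors' own data handling ships with the DIG repository linked in the table.

```python
# Minimal sketch for obtaining the two public dataset families named above.
# Assumption: PyTorch Geometric's built-in loaders are used here for illustration;
# the authors' actual pipeline is part of the DIG library linked in the table.
from torch_geometric.datasets import PPI, MoleculeNet

# PPI: protein-protein interaction graphs; each node carries 121 binary labels,
# matching the "121 node classification tasks" quoted in the hardware row.
ppi_train = PPI(root="data/PPI", split="train")
print(ppi_train[0].y.shape)  # -> [num_nodes, 121] multi-label targets

# MoleculeNet: a collection of molecular property-prediction datasets;
# "BACE" is one example task name, chosen here only for illustration.
bace = MoleculeNet(root="data/MoleculeNet", name="BACE")
print(bace[0])  # a single molecular graph with node features and property label(s)
```

Downloading either dataset on first use requires network access; both loaders cache the processed graphs under the given root directory.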