Capsule Graph Neural Network

Authors: Zhang Xinyi, Lihui Chen

ICLR 2019

Reproducibility Variable Result LLM Response
Research Type Experimental Our extensive evaluations with 10 graph-structured datasets demonstrate that CapsGNN has a powerful mechanism to capture macroscopic properties of the whole graph in a data-driven manner. It outperforms other SOTA techniques on several graph classification tasks by virtue of the new instrument.
Researcher Affiliation Academia Zhang Xinyi, Lihui Chen School of Electrical and Electronic Engineering Nanyang Technological University, Singapore xinyi001@e.ntu.edu.sg, elhchen@ntu.edu.sg
Pseudocode Yes Algorithm 1 Dynamic routing mechanism returns parent capsules H given children capsules S, a set of trainable transform matrices W and the number of iterations t.
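The routing step described in Algorithm 1 follows the standard dynamic routing of capsule networks. A minimal NumPy sketch under that assumption, taking the votes (children capsules already multiplied by the transform matrices W) as input; all names here are illustrative, not from the paper's (unreleased) code:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Capsule squash non-linearity: keeps direction, maps the norm into [0, 1)."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iters=3):
    """Route votes u_hat of shape (n_children, n_parents, d)
    into parent capsules of shape (n_parents, d)."""
    n_child, n_parent, d = u_hat.shape
    b = np.zeros((n_child, n_parent))                          # routing logits
    for _ in range(num_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)   # softmax over parents
        s = (c[..., None] * u_hat).sum(axis=0)                 # weighted sum of votes
        v = squash(s)                                          # candidate parent capsules
        b = b + (u_hat * v[None]).sum(axis=-1)                 # agreement update
    return v
```

With three iterations (the paper's setting, t = 3), children whose votes agree with a parent's current output get progressively larger coupling coefficients.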
Open Source Code No The paper provides a link in footnote 2 to a website with visualizations, but not the source code for the CapsGNN method itself. "More details of graph distribution can be found at https://sites.google.com/view/capsgnn/home where we provide more figures generated from COLLAB and REDDIT-MULTI-12K."
Open Datasets Yes Five biological graph datasets: MUTAG, ENZYMES, NCI1, PROTEINS, D&D and five social network datasets: COLLAB, IMDB-B, IMDB-M, RE-M5K, RE-M12K (Yanardag & Vishwanathan, 2015) are used for our experimental study. Details of these datasets can be found in Appendix B.
Dataset Splits Yes We applied 10-fold cross validation to evaluate the performance objectively. Each time we use 1 fold as a validation fold to adjust hyper-parameters, 8 folds to train the architecture, and the remaining 1 fold to test the performance.
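The 8/1/1 split scheme described above can be sketched as follows (a hypothetical helper, since the paper releases no splitting code):

```python
import numpy as np

def ten_fold_splits(n_samples, seed=0):
    """Yield (train_idx, val_idx, test_idx) for 10-fold CV: per round,
    8 folds train the model, 1 fold tunes hyper-parameters, 1 fold tests."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n_samples), 10)
    for k in range(10):
        test = folds[k]
        val = folds[(k + 1) % 10]  # one held-out fold for hyper-parameter tuning
        train = np.concatenate([folds[j] for j in range(10)
                                if j not in (k, (k + 1) % 10)])
        yield train, val, test
```

Note that accuracy is then reported as the mean over the 10 test folds; which fold serves as validation in each round is a design choice the paper does not pin down.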
Hardware Specification No The paper does not provide any specific hardware details such as GPU/CPU models or memory used for experiments.
Software Dependencies No The paper does not mention specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions).
Experiment Setup Yes The same architecture settings are used in CapsGNN for all datasets to show its robust performance. For the node capsules extraction, the GCN has 5 layers (L = 5), and the number of channels at each layer is set the same, which is 2 (Cl = 2). The number of graph capsules is fixed as 16 (P = 16). The dimension of all capsules is set as 8 (d = 8). The number of units in the hidden layer of the Attention Module is set as 1/16 of the number of input units. The number of iterations in routing is set as 3. During the training stage, we simultaneously reduce Loss_c and Loss_r, and we scale Loss_r by 0.1 so that the model focuses on the classification task. λ is set as 0.5 and 1.0 for multi-class classification and binary classification respectively.
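The reported settings can be collected into a single configuration sketch; the key names are hypothetical (the paper publishes no code), only the values come from the text:

```python
# Hypothetical config mirroring the reported CapsGNN hyper-parameters.
CONFIG = {
    "gcn_layers": 5,            # L = 5
    "channels_per_layer": 2,    # Cl = 2, same at every layer
    "num_graph_capsules": 16,   # P = 16
    "capsule_dim": 8,           # d = 8, shared by all capsules
    "attention_hidden_ratio": 1 / 16,  # hidden units = input units / 16
    "routing_iterations": 3,
    "recon_loss_weight": 0.1,   # scale applied to Loss_r
    "margin_lambda": {"multi_class": 0.5, "binary": 1.0},
}

def total_loss(loss_c, loss_r, recon_weight=CONFIG["recon_loss_weight"]):
    """Combined objective: classification loss plus down-weighted
    reconstruction loss, as described in the setup."""
    return loss_c + recon_weight * loss_r
```

Scaling the reconstruction term by 0.1 keeps it as a regularizer rather than a competing objective, matching the stated intent that the model focus on classification.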