Geom-GCN: Geometric Graph Convolutional Networks

Authors: Hongbin Pei, Bingzhe Wei, Kevin Chen-Chuan Chang, Yu Lei, Bo Yang

ICLR 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results show the proposed Geom-GCN achieved state-of-the-art performance on a wide range of open datasets of graphs.
Researcher Affiliation | Academia | (1) College of Computer Science and Technology, Jilin University, China; (2) Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, USA; (3) Department of Computer Science, University of Illinois at Urbana-Champaign, USA; (4) Department of Computing, Hong Kong Polytechnic University, Hong Kong; (5) Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, China
Pseudocode | No | The paper describes the model using mathematical equations but does not include explicit pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide an explicit statement about, or link to, open-sourced code.
Open Datasets | Yes | We utilize nine open graph datasets to validate the proposed Geom-GCN. ... Cora, Citeseer, and Pubmed are standard citation network benchmark datasets (Sen et al., 2008; Namata et al., 2012). ... WebKB is a webpage dataset collected from computer science departments of various universities by Carnegie Mellon University. We use three of its sub-datasets, Cornell, Texas, and Wisconsin, where nodes represent web pages, and edges are hyperlinks between them. http://www.cs.cmu.edu/afs/cs.cmu.edu/project/theo-11/www/wwkb
Dataset Splits | Yes | For all graph datasets, we randomly split nodes of each class into 60%, 20%, and 20% for training, validation and testing.
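The quoted protocol is a per-class (stratified) random split rather than a global one. A minimal sketch of how such a split could be implemented is below; the function name and interface are illustrative and not taken from the paper:

```python
import random
from collections import defaultdict

def split_nodes_per_class(labels, train_frac=0.6, val_frac=0.2, seed=0):
    """Randomly split node indices of EACH class into train/val/test.

    Mirrors the stated 60%/20%/20% per-class protocol; this helper is
    an illustrative sketch, not code from the paper.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)

    train, val, test = [], [], []
    for nodes in by_class.values():
        rng.shuffle(nodes)
        n_train = int(len(nodes) * train_frac)
        n_val = int(len(nodes) * val_frac)
        train += nodes[:n_train]
        val += nodes[n_train:n_train + n_val]
        test += nodes[n_train + n_val:]
    return train, val, test
```

Stratifying per class keeps the label distribution roughly equal across the three partitions, which matters for the strongly class-imbalanced WebKB graphs.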
Hardware Specification | No | The paper compares running times of different models but does not specify any hardware details (e.g., CPU, GPU models, or memory) used for the experiments.
Software Dependencies | No | The paper mentions software components such as the Adam optimizer and activation functions (ReLU, ELU), but does not provide version numbers for any software or libraries.
Experiment Setup | Yes | We specify the dimension of embedding space as two, and use the relationship operator τ defined in Table 1, and apply mean and concatenation as the low- and high-level aggregation function, respectively. ... The searched hyperparameters include number of hidden units, initial learning rate, weight decay, and dropout. We fix the number of layers to 2 and use the Adam optimizer (Kingma & Ba, 2014) for all models. We use ReLU as the activation function for Geom-GCN and GCN, and ELU for GAT. ... The final hyperparameter setting is dropout of p = 0.5, initial learning rate of 0.05, patience of 100 epochs, and weight decay of 5E-6 (WebKB datasets) or 5E-5 (all other datasets).
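The final setting quoted above can be collected into a single configuration. A hedged sketch follows; the key names are illustrative, and the only dataset-dependent value is the weight decay, which follows the quoted WebKB-versus-other rule:

```python
def make_config(dataset):
    """Assemble the final hyperparameters quoted from the paper.

    Key names are illustrative, not from the paper; values follow the
    quoted setup: 2 layers, Adam, dropout 0.5, initial lr 0.05,
    patience 100 epochs, and weight decay 5e-6 on the WebKB
    sub-datasets (Cornell, Texas, Wisconsin) or 5e-5 elsewhere.
    """
    webkb = {"cornell", "texas", "wisconsin"}
    return {
        "num_layers": 2,
        "optimizer": "adam",
        "dropout": 0.5,
        "learning_rate": 0.05,
        "patience_epochs": 100,
        "weight_decay": 5e-6 if dataset.lower() in webkb else 5e-5,
    }
```

For example, `make_config("Texas")` selects the smaller weight decay of 5e-6, while `make_config("Cora")` selects 5e-5.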