Graph Neural Networks for Link Prediction with Subgraph Sketching
Authors: Benjamin Paul Chamberlain, Sergey Shirobokov, Emanuele Rossi, Fabrizio Frasca, Thomas Markovich, Nils Yannick Hammerla, Michael M. Bronstein, Max Hansmire
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental evaluation shows that our method compares favorably to the state of the art in terms of both accuracy and speed. |
| Researcher Affiliation | Collaboration | Benjamin P. Chamberlain (Charm Therapeutics); Sergey Shirobokov (ShareChat AI); Emanuele Rossi (Imperial College London); Fabrizio Frasca (Imperial College London); Thomas Markovich (Twitter Inc.); Nils Hammerla (Twitter Inc.); Michael M. Bronstein (University of Oxford); Max Hansmire (Twitter Inc.) |
| Pseudocode | Yes | Algorithm 1 HyperLogLog: Estimate cardinality; Algorithm 2 MinHash: Estimate Jaccard similarity; Algorithm 3 Complete Procedure |
| Open Source Code | Yes | We provide an open source PyTorch library for (sub)graph sketching that generates data sketches via message passing on the GPU. [...] Code and instructions to reproduce the experiments are available at https://github.com/melifluos/subgraph-sketching. |
| Open Datasets | Yes | We report results for the most widely used Planetoid citation networks Cora (McCallum et al., 2000), Citeseer (Sen et al., 2008) and Pubmed (Namata et al., 2012) and the OGB link prediction datasets (Hu et al., 2020). |
| Dataset Splits | Yes | OGB datasets have fixed splits, whereas for Planetoid, random 70/10/20 percent train/val/test splits were generated. |
| Hardware Specification | Yes | We utilized either AWS p2 or p3 machines, with 8 Tesla K80 and 8 Tesla V100 GPUs respectively, to perform all the experiments in the paper. |
| Software Dependencies | No | Our code is implemented in PyTorch (Paszke et al., 2019), using PyTorch Geometric (Fey & Lenssen, 2019). (Does not provide version numbers for PyTorch or PyTorch Geometric) |
| Experiment Setup | Yes | The p parameter used by HyperLogLog was 8 and the number of permutations used by MinHashing was 128. All hyperparameters were tuned using Weights and Biases random search. The search space was over hidden dimension (64–512), learning rate (0.0001–0.01), dropout (0–1), layers (1–3) and weight decay (0–0.001). |
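To make the MinHash setting in the table concrete: with 128 permutations (the paper's value), the Jaccard similarity of two node neighbourhoods is estimated as the fraction of signature slots that agree. The sketch below is an illustrative plain-Python version, not the paper's GPU message-passing implementation; the random linear hash family and the toy neighbourhoods are assumptions for demonstration.

```python
import random

def minhash_signature(items, perms):
    """MinHash signature: the minimum hash value of the set under each permutation."""
    return [min(h(x) for x in items) for h in perms]

def estimate_jaccard(sig_a, sig_b):
    """Fraction of matching signature slots is an unbiased Jaccard estimate."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

# 128 random linear hash functions, matching the paper's permutation count.
random.seed(0)
MASK = (1 << 61) - 1  # large Mersenne prime modulus
perms = [
    (lambda a, b: (lambda x: (a * hash(x) + b) % MASK))(
        random.randrange(1, MASK), random.randrange(0, MASK)
    )
    for _ in range(128)
]

# Toy "neighbourhoods" of two nodes with known overlap.
a = set(range(0, 100))
b = set(range(50, 150))
true_jaccard = len(a & b) / len(a | b)  # 50 / 150

est = estimate_jaccard(minhash_signature(a, perms), minhash_signature(b, perms))
print(true_jaccard, est)
```

With 128 slots the standard error of the estimate is roughly sqrt(J(1-J)/128), about 0.04 here, which is why a modest number of permutations suffices for ranking candidate links.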
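The HyperLogLog parameter p=8 quoted above means 2^8 = 256 registers per sketch. A minimal standalone estimator is sketched below; this is a generic textbook HyperLogLog (SHA-1 hashing and the bias constant are standard choices, not taken from the paper's code).

```python
import hashlib
import math

def hll_estimate(items, p=8):
    """HyperLogLog cardinality estimate with 2**p registers (p=8 as in the paper)."""
    m = 1 << p
    registers = [0] * m
    for x in items:
        h = int.from_bytes(hashlib.sha1(str(x).encode()).digest()[:8], "big")
        idx = h >> (64 - p)                  # first p bits select a register
        rest = h & ((1 << (64 - p)) - 1)     # remaining 64-p bits
        rank = (64 - p) - rest.bit_length() + 1  # position of leftmost 1-bit
        registers[idx] = max(registers[idx], rank)
    alpha = 0.7213 / (1 + 1.079 / m)         # bias correction for m >= 128
    raw = alpha * m * m / sum(2.0 ** -r for r in registers)
    zeros = registers.count(0)
    if raw <= 2.5 * m and zeros:             # small-range linear-counting correction
        return m * math.log(m / zeros)
    return raw

print(round(hll_estimate(range(10000))))
```

At p=8 the typical relative error is about 1.04/sqrt(256) ≈ 6.5%, a deliberate trade of accuracy for the constant-size sketches that make subgraph counts cheap to propagate.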