Comparing Graph Transformers via Positional Encodings

Authors: Mitchell Black, Zhengchao Wan, Gal Mishne, Amir Nayyeri, Yusu Wang

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "C. Experiments. In this section, we carry out experiments to validate our two main results, Theorem 3.8 and Theorem 3.10. Our code is adapted from the GraphGPS module (Rampášek et al., 2022) and a subsequent fork from Müller et al. (2024). (...) C.1. Graph Isomorphism: CSL. (...) Table 1. Test performance on the CSL dataset of different APEs."
Researcher Affiliation | Academia | "1 School of Electrical Engineering and Computer Science, Oregon State University, Corvallis, Oregon, USA. 2 Halıcıoğlu Data Science Institute, University of California San Diego, San Diego, California, USA."
Pseudocode | No | The paper describes algorithms (e.g., the Weisfeiler-Lehman algorithm) in prose rather than in structured pseudocode or algorithm blocks (a minimal executable sketch of 1-WL refinement is given after this table).
Open Source Code | Yes | "Code for all experiments can be found at https://github.com/blackmit/comparing_graph_transformers_via_positional_encodings"
Open Datasets | Yes | "C.1. Graph Isomorphism: CSL (...) We consider the graph isomorphism benchmark dataset BREC (Wang & Zhang, 2024). (...) In this experiment, we compare RPE-GTs and EGN APE-GTs for graph regression on the small ZINC dataset containing 12k graphs (Dwivedi et al., 2023)."
Dataset Splits | No | The paper reports "test performance" on its datasets but does not explicitly state the training/validation/test splits (e.g., percentages or graph counts) needed for exact reproduction (see the split-loading sketch after this table).
Hardware Specification | No | The paper does not explicitly specify the hardware used for the experiments (e.g., GPU/CPU models, processor types, or memory amounts).
Software Dependencies | Yes | "Table 6. Hyperparameters for ZINC Experiment (...) Python 3.8, PyTorch 1.9, and CUDA 11.1" (a version sanity-check snippet follows the table).
Experiment Setup | Yes | "Table 6. Hyperparameters for ZINC Experiment (...)" (values transcribed into a config sketch after this table):
  (first block; heading not included in the extracted quote)
    # Transformer Layers: 14
    # Transformer Heads: 8
    # Gaussian Kernels: 16
    # MLP Layers: 2
    MLP Hidden Dimension (No Edge Features): 16
    MLP Hidden Dimension (Edge Features): 16
  EGN APE GTs:
    # Transformer Layers: 8
    # Transformer Heads: 8
    # EGN Layers: 6
    EGN Hidden Dim (No Edge Features): 48
    EGN Hidden Dim (Edge Features): 64
    APE Type: Add
  SPE:
    # Deep Sets Layers: 3
    Deep Sets Hidden Dimension: 64
  # Parameters: 17217
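The Weisfeiler-Lehman test that the paper describes in prose is short enough to state in code. Below is a minimal sketch of standard 1-WL color refinement, written for this report rather than taken from the authors' repository; the function name and the adjacency-dict graph representation are our own choices.

```python
from collections import Counter

def wl_color_histogram(adj):
    """1-WL (color refinement) on a graph given as an adjacency dict,
    e.g. {0: [1, 2], 1: [0], 2: [0]}. Returns the multiset of stable
    node colors; two graphs with different histograms are provably
    non-isomorphic (equal histograms are inconclusive)."""
    colors = {v: 0 for v in adj}                 # uniform initial coloring
    for _ in range(len(adj)):                    # stabilizes within n rounds
        # New color = own color plus the sorted multiset of neighbor colors.
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: palette[sigs[v]] for v in adj}
        if len(set(new_colors.values())) == len(set(colors.values())):
            break                                # partition stopped refining
        colors = new_colors
    return Counter(colors.values())

# Example: a 4-cycle and two disjoint edges get the same 1-WL histogram
# despite being non-isomorphic; regular graphs like those in the CSL
# benchmark fool 1-WL in exactly this way.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
two_edges = {0: [1], 1: [0], 2: [3], 3: [2]}
print(wl_color_histogram(c4) == wl_color_histogram(two_edges))  # True
```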
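On splits: the paper does not state them, but since its code builds on GraphGPS and it uses the 12k-graph ZINC subset, a plausible (unconfirmed) assumption is the benchmark's standard predefined splits, which PyTorch Geometric ships as 10,000 train / 1,000 validation / 1,000 test graphs:

```python
from torch_geometric.datasets import ZINC

# ZINC with subset=True is the 12k-graph benchmark version cited in the
# paper (Dwivedi et al., 2023); PyG provides it with fixed splits.
train = ZINC(root="data/ZINC", subset=True, split="train")  # 10,000 graphs
val   = ZINC(root="data/ZINC", subset=True, split="val")    # 1,000 graphs
test  = ZINC(root="data/ZINC", subset=True, split="test")   # 1,000 graphs
print(len(train), len(val), len(test))
```

Whether the authors used exactly these predefined splits can only be confirmed from their repository.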
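To match the quoted environment before rerunning experiments, a quick sanity check of interpreter and library versions can help. This snippet only reads version strings and assumes nothing beyond an installed PyTorch; the expected values are the ones quoted from Table 6.

```python
import sys
import torch

# Versions quoted from the paper's Table 6: Python 3.8, PyTorch 1.9, CUDA 11.1.
print("Python :", sys.version.split()[0])   # expect 3.8.x
print("PyTorch:", torch.__version__)        # expect 1.9.x
print("CUDA   :", torch.version.cuda)       # expect 11.1 (None on CPU-only builds)
print("GPU visible:", torch.cuda.is_available())
```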
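Finally, the flattened Table 6 quote transcribes naturally into a configuration dictionary. The grouping and key names below are an illustrative reconstruction, not the authors' config schema: the first block's heading is elided in the quote, and the placement of "APE Type" and "# Parameters" is our guess. Only the numeric values come from the paper.

```python
# Hypothetical transcription of Table 6 (ZINC experiment). Key names and
# grouping are illustrative; values are quoted from the paper.
ZINC_HPARAMS = {
    # First block of Table 6; its heading is elided in the extracted quote
    # (presumably the RPE-GT settings, given the experiment compares
    # RPE-GTs against EGN APE-GTs).
    "first_block": {
        "num_transformer_layers": 14,
        "num_transformer_heads": 8,
        "num_gaussian_kernels": 16,
        "num_mlp_layers": 2,
        "mlp_hidden_dim_no_edge_features": 16,
        "mlp_hidden_dim_edge_features": 16,
    },
    "egn_ape_gt": {
        "num_transformer_layers": 8,
        "num_transformer_heads": 8,
        "num_egn_layers": 6,
        "egn_hidden_dim_no_edge_features": 48,
        "egn_hidden_dim_edge_features": 64,
        "ape_type": "Add",  # appears between the EGN and SPE blocks in the quote
    },
    "spe": {
        "num_deep_sets_layers": 3,
        "deep_sets_hidden_dim": 64,
    },
    "num_parameters": 17217,  # grouping ambiguous in the quote; kept top-level
}
```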