LGI-GT: Graph Transformers with Local and Global Operators Interleaving

Authors: Shuo Yin, Guoqiang Zhong

IJCAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate that LGI-GT performs consistently better than previous state-of-the-art GNNs and GTs, while ablation studies show the effectiveness of the proposed LGI scheme and EELA.
Researcher Affiliation | Academia | College of Computer Science and Technology, Ocean University of China; yinshuo@stu.ouc.edu.cn, gqzhong@ouc.edu.cn
Pseudocode | Yes | Algorithm 1: Updating the embeddings of [CLS]
Open Source Code | Yes | The source code of LGI-GT is available at https://github.com/shuoyinn/LGI-GT.
Open Datasets | Yes | Among all the datasets we tested on, ZINC, PATTERN, CLUSTER were from [Dwivedi et al., 2020], whilst ogbg-molpcba and ogbg-code2 were from OGB [Hu et al., 2020a].
Dataset Splits | Yes | Evaluation metrics and dataset splits were the same as in the original papers for each dataset. ... we took mean ± std of 10 runs with different random seeds.
Hardware Specification | No | The paper does not specify the hardware (e.g., GPU/CPU models, memory) used to run the experiments.
Software Dependencies | No | The paper does not provide version numbers for any software dependencies (e.g., programming languages, libraries, or frameworks).
Experiment Setup | Yes | On each dataset, we used the same number of hidden dimensions F and number of layers (or blocks) L as GPS. ... To achieve a fair comparison, m = n = 1 were kept constant (never tuned) across all the datasets.