ReGCL: Rethinking Message Passing in Graph Contrastive Learning

Authors: Cheng Ji, Zixuan Huang, Qingyun Sun, Hao Peng, Xingcheng Fu, Qian Li, Jianxin Li

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate the superiority of the proposed method in comparison to state-of-the-art baselines across various node classification benchmarks.
Researcher Affiliation | Academia | (1) Beijing Advanced Innovation Center for Big Data and Brain Computing, Beihang University, China; (2) School of Computer Science and Engineering, Beihang University, China; (3) Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education, Guangxi Normal University, China
Pseudocode | Yes | The detailed algorithm and complexity analysis of ReGCL can be found in Appendix B.
Open Source Code | Yes | https://github.com/RingBDStack/ReGCL
Open Datasets | Yes | Cora and Citeseer are citation networks that are widely used as node classification benchmarks (Kipf and Welling 2017), Amazon Photo is the Amazon co-purchase network (Shchur et al. 2018), and Coauthor CS includes the co-authorships of the academic graph (Shchur et al. 2018).
Dataset Splits | Yes | Please refer to Appendix C for more details on the dataset split and hyperparameter settings.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU, GPU models) used for running its experiments.
Software Dependencies | No | The paper does not specify software dependencies with version numbers.
Experiment Setup | Yes | Please refer to Appendix C for more details on the dataset split and hyperparameter settings.
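
For context on the "Open Datasets" row, the sketch below shows one way the four cited benchmarks could be loaded. It assumes PyTorch Geometric, which the paper does not explicitly name as a dependency; the repository linked above remains the authoritative setup.

    # Minimal sketch; assumption: PyTorch Geometric supplies these benchmark loaders
    # (the paper does not list software dependencies or versions).
    from torch_geometric.datasets import Planetoid, Amazon, Coauthor

    # Citation networks (Kipf and Welling 2017)
    cora = Planetoid(root="data/Planetoid", name="Cora")
    citeseer = Planetoid(root="data/Planetoid", name="CiteSeer")

    # Amazon co-purchase network and co-authorship graph (Shchur et al. 2018)
    photo = Amazon(root="data/Amazon", name="Photo")
    cs = Coauthor(root="data/Coauthor", name="CS")

    for name, dataset in [("Cora", cora), ("CiteSeer", citeseer),
                          ("Amazon Photo", photo), ("Coauthor CS", cs)]:
        data = dataset[0]  # each dataset holds a single graph
        print(f"{name}: {data.num_nodes} nodes, {data.num_edges} edges, "
              f"{dataset.num_classes} classes")

The train/validation/test splits and hyperparameters are described in Appendix C of the paper, so the loaders above are only an illustration of data access, not of the evaluation protocol.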