Revisiting Graph Contrastive Learning from the Perspective of Graph Spectrum
Authors: Nian Liu, Xiao Wang, Deyu Bo, Chuan Shi, Jian Pei
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | extensive experiments well demonstrate that it can further improve the performances of a wide variety of different GCL methods. |
| Researcher Affiliation | Academia | Nian Liu¹, Xiao Wang¹, Deyu Bo¹, Chuan Shi¹,², Jian Pei³ — ¹Beijing University of Posts and Telecommunications, ²Peng Cheng Laboratory, ³Simon Fraser University |
| Pseudocode | Yes | Algorithm 1: Sinkhorn's Iteration |
| Open Source Code | Yes | Code available at https://github.com/liun-online/SpCo |
| Open Datasets | Yes | We conduct the node classification on four datasets: Cora, Citeseer [11], BlogCatalog, and Flickr [16]. Details of datasets are in Appendix D.2. |
| Dataset Splits | Yes | For all methods, we use 20 training nodes per class, 500 validation nodes, and 1000 test nodes for Cora, Citeseer and Pubmed datasets, following standard practice [11]. |
| Hardware Specification | Yes | Appendix D.5 Total Compute. All experiments were conducted on a single NVIDIA 3090 GPU. |
| Software Dependencies | Yes | We implement our model in PyTorch 1.8.1 with CUDA 11.1, Python 3.8.5. All experiments were conducted on a single NVIDIA 3090 GPU. |
| Experiment Setup | Yes | Experimental implementation details are given in Appendix D.1. In our experiment, we optimize with the Adam optimizer with learning rate 0.0001 and weight decay 0.00001. The maximum training epochs are set to 500 for Cora, Citeseer, Pubmed, and 200 for BlogCatalog, Flickr. We adopt early stopping with patience 20. We set the temperature τ = 0.5 for all models using InfoNCE loss. |
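The optimizer and loss settings reported above can be sketched in PyTorch. This is a minimal illustration, not the paper's actual SpCo implementation (see the linked repository for that): the `encoder` module is a hypothetical stand-in, and `info_nce_loss` is a generic InfoNCE formulation using the stated temperature τ = 0.5, Adam learning rate 1e-4, and weight decay 1e-5.

```python
import torch
import torch.nn.functional as F

# Hypothetical placeholder encoder; the paper's real model lives in its repo.
encoder = torch.nn.Linear(16, 8)

# Optimizer settings as reported: Adam, lr = 0.0001, weight decay = 0.00001.
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4, weight_decay=1e-5)

def info_nce_loss(z1, z2, tau=0.5):
    """Generic InfoNCE contrastive loss with temperature tau = 0.5.

    z1, z2: embeddings of two views, shape (num_nodes, dim); row i of z1
    and row i of z2 are treated as the positive pair.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    sim = z1 @ z2.t() / tau                # cosine similarities, scaled by tau
    labels = torch.arange(z1.size(0))      # positives lie on the diagonal
    return F.cross_entropy(sim, labels)
```

Early stopping with patience 20 would then wrap the training loop: stop once the validation metric has not improved for 20 consecutive epochs, capped at the per-dataset maximum epoch counts above.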