Augmentation-Free Self-Supervised Learning on Graphs

Authors: Namkyeong Lee, Junseok Lee, Chanyoung Park (pp. 7372-7380)

AAAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments towards various node-level tasks, i.e., node classification, clustering, and similarity search on various real-world datasets demonstrate the superiority of AFGRL.
Researcher Affiliation | Academia | Namkyeong Lee (1), Junseok Lee (1), Chanyoung Park (1,2): (1) Dept. of Industrial and Systems Engineering, KAIST, Daejeon, Republic of Korea; (2) Graduate School of Artificial Intelligence, KAIST, Daejeon, Republic of Korea
Pseudocode | No | The paper describes its method in text and uses diagrams (Figure 2, Figure 4) to illustrate the process, but it does not include a formally labeled pseudocode or algorithm block.
Open Source Code | Yes | The source code for AFGRL is available at https://github.com/Namkyeong/AFGRL.
Open Datasets | Yes | To evaluate AFGRL, we conduct experiments on five widely used datasets: WikiCS (Mernyei and Cangea 2020), Amazon Computers and Amazon Photo (McAuley et al. 2015), and Coauthor CS and Coauthor Physics (Sinha et al. 2015).
Dataset Splits | No | For node classification, we use the learned embeddings to train and test a simple logistic regression classifier (Veličković et al. 2018). We report the test performance when the performance on validation data gives the best result.
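The linear-evaluation protocol quoted above (freeze the embeddings, fit a logistic regression classifier, and report the test score obtained at the best validation score) can be sketched as follows. The synthetic data, split sizes, and regularization grid here are illustrative assumptions, not values from the paper.

```python
# Sketch of the linear-evaluation protocol: train a logistic regression
# classifier on frozen embeddings and report the test score at the best
# validation score. Data, splits, and the C grid are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
emb = rng.normal(size=(300, 16))       # stand-in for learned node embeddings
labels = (emb[:, 0] > 0).astype(int)   # synthetic labels for this sketch

train, val, test = np.split(rng.permutation(300), [180, 240])

best_val, test_at_best = -1.0, 0.0
for C in (0.01, 0.1, 1.0, 10.0):       # assumed regularization grid
    clf = LogisticRegression(C=C, max_iter=1000)
    clf.fit(emb[train], labels[train])
    val_acc = clf.score(emb[val], labels[val])
    if val_acc > best_val:
        best_val = val_acc
        test_at_best = clf.score(emb[test], labels[test])

print(round(test_at_best, 3))
```

Because model selection uses only the validation split, the reported test number stays an unbiased estimate of generalization for the selected classifier.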
Hardware Specification | Yes | OOM: Out of memory on 24GB RTX3090
Software Dependencies | No | The paper mentions the models and frameworks used (e.g., GCN, BYOL, K-means) and general software (e.g., a logistic regression classifier) but does not provide specific version numbers for any software dependencies.
Experiment Setup | Yes | We perform a grid search on several hyperparameters, such as the learning rate η, decay rate τ, node embedding dimension size D, and number of layers L of the GCN encoder, for fair comparison.
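A minimal sketch of such a grid search follows. Only the hyperparameter names (learning rate η, decay rate τ, embedding dimension D, number of GCN layers L) come from the paper; the value ranges and the scoring stub are illustrative assumptions standing in for full training runs.

```python
# Sketch of a hyperparameter grid search; the names mirror the paper
# (learning rate eta, decay rate tau, embedding dimension D, GCN depth L)
# but the value ranges and the scoring stub are illustrative assumptions.
from itertools import product

grid = {
    "lr": [1e-3, 5e-4],     # learning rate eta
    "tau": [0.99, 0.999],   # decay rate tau of the target network
    "dim": [256, 512],      # node embedding dimension D
    "layers": [1, 2],       # number of GCN encoder layers L
}

def train_and_evaluate(cfg):
    # Stub standing in for a full training run; returns a dummy score.
    return cfg["dim"] / 512 - cfg["lr"]

best_cfg, best_score = None, float("-inf")
for values in product(*grid.values()):
    cfg = dict(zip(grid, values))
    score = train_and_evaluate(cfg)
    if score > best_score:
        best_cfg, best_score = cfg, score

print(best_cfg)
```

In practice `train_and_evaluate` would train the encoder with the given configuration and return validation performance; the loop then keeps the best-scoring configuration.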