Multi-Scale Subgraph Contrastive Learning

Authors: Yanbei Liu, Yu Zhao, Xiao Wang, Lei Geng, Zhitao Xiao

IJCAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments and parametric analysis on eight real-world graph classification datasets demonstrate the effectiveness of the proposed method."
Researcher Affiliation | Academia | School of Life Sciences, Tiangong University; School of Electronics and Information Engineering, Tiangong University; School of Software, Beihang University
Pseudocode | Yes | Algorithm 1, "The training process of the MSSGCL", is provided (a hedged training-loop sketch appears after the table).
Open Source Code | No | The paper provides no links or explicit statements about the availability of open-source code for the described method.
Open Datasets | Yes | "We adopt the TUDataset benchmark [Morris et al., 2020], which contains different types of graphs, i.e., molecules and social networks." (a loading sketch follows the table)
Dataset Splits | Yes | "We use 10-fold cross-validation accuracy to report classification performance. Experiments are repeated 5 times." (a protocol sketch follows the table)
Hardware Specification | No | The paper does not describe the hardware (e.g., GPU models, CPU types) used to run its experiments.
Software Dependencies | No | The paper mentions using GIN as the encoder and the Adam optimizer, but gives no version numbers for these or any other software dependencies.
Experiment Setup | Yes | "In our framework, we set the global view size to 80% of the whole graph and the local view size to 20% for molecular graphs, and to 90% (global) and 10% (local) for social networks. The measurement function between local views is a 5-layer MLP with batch normalization and ReLU activations; its output is fed into a sigmoid function, which outputs a scalar indicating the similarity between two local views. [...] we adopt GIN as the encoder, and a sum pooling is used as the readout function. [...] where λ1 and λ2 are hyper-parameters to balance different loss terms." (component and training sketches follow the table)
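
The datasets come from the TUDataset benchmark. Below is a minimal loading sketch assuming PyTorch Geometric (the paper does not state its software stack), with MUTAG standing in for any of the eight datasets:

```python
# Sketch only: the paper names TUDataset [Morris et al., 2020] but not its
# software stack; PyTorch Geometric's built-in loader is an assumption.
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader

# MUTAG is one molecular TUDataset benchmark; substitute any dataset name.
dataset = TUDataset(root="data/TUDataset", name="MUTAG")
loader = DataLoader(dataset, batch_size=128, shuffle=True)

print(len(dataset), dataset.num_classes, dataset.num_features)
```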
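
The split protocol (10-fold cross-validation accuracy, repeated 5 times) can be sketched as follows. `train_and_eval` is a hypothetical stand-in for pre-training MSSGCL and fitting the downstream classifier on one fold, and scikit-learn's `StratifiedKFold` is an assumption, since the paper does not name its splitter:

```python
# Hedged sketch of the evaluation protocol: 10-fold CV, repeated 5 times.
import numpy as np
from sklearn.model_selection import StratifiedKFold

def evaluate(labels, train_and_eval, repeats=5, folds=10):
    accs = []
    for rep in range(repeats):
        skf = StratifiedKFold(n_splits=folds, shuffle=True, random_state=rep)
        # StratifiedKFold only needs labels for stratification; the dummy
        # feature array just carries the sample count.
        for tr, te in skf.split(np.zeros(len(labels)), labels):
            accs.append(train_and_eval(tr, te))  # returns fold accuracy
    return float(np.mean(accs)), float(np.std(accs))
```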
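
The setup row translates fairly directly into model components. The sketch below, assuming PyTorch and PyTorch Geometric, shows a GIN encoder with a sum-pooling readout and the 5-layer MLP similarity head; the class names, layer widths, and depth are assumptions, not the authors' code:

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GINConv, global_add_pool

# View-size ratios quoted in the setup: (global, local) fractions of the graph.
VIEW_RATIOS = {"molecular": (0.8, 0.2), "social": (0.9, 0.1)}

def _mlp(in_dim, hidden):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, hidden))

class GINEncoder(nn.Module):
    """GIN encoder with a sum-pooling readout, as stated in the setup."""
    def __init__(self, in_dim, hidden=64, num_layers=3):  # widths assumed
        super().__init__()
        self.convs = nn.ModuleList(
            GINConv(_mlp(in_dim if i == 0 else hidden, hidden))
            for i in range(num_layers))

    def forward(self, x, edge_index, batch):
        for conv in self.convs:
            x = conv(x, edge_index).relu()
        return global_add_pool(x, batch)  # sum pooling over each graph

class LocalSimilarity(nn.Module):
    """5-layer MLP with batch norm and ReLU; a final sigmoid outputs a
    scalar similarity between two local-view embeddings."""
    def __init__(self, dim):
        super().__init__()
        layers, in_dim = [], 2 * dim  # the two view embeddings are concatenated
        for _ in range(5):
            layers += [nn.Linear(in_dim, dim), nn.BatchNorm1d(dim), nn.ReLU()]
            in_dim = dim
        layers.append(nn.Linear(dim, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, z1, z2):
        return torch.sigmoid(self.net(torch.cat([z1, z2], dim=-1))).squeeze(-1)
```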
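
Finally, a hedged sketch of what Algorithm 1's training loop plausibly looks like, given the components above. `sample_views`, `global_local_loss`, `local_loss`, and `global_loss` are hypothetical placeholders for the paper's view sampler and its contrastive loss terms; only the Adam optimizer and the λ1/λ2 weighting are stated in the paper:

```python
import torch

def train_mssgcl(encoder, similarity, loader, lam1, lam2,
                 epochs=100, lr=1e-3):
    params = list(encoder.parameters()) + list(similarity.parameters())
    opt = torch.optim.Adam(params, lr=lr)  # Adam, as stated in the paper
    for epoch in range(epochs):
        for batch in loader:
            # Hypothetical helper: sample one global and one local subgraph
            # view per graph at the ratios quoted in the setup row.
            g_view, l_view = sample_views(batch)
            z_g = encoder(g_view.x, g_view.edge_index, g_view.batch)
            z_l = encoder(l_view.x, l_view.edge_index, l_view.batch)
            # λ1 and λ2 balance the loss terms, per the paper; the terms
            # themselves are placeholders for the paper's contrastive losses.
            loss = (global_local_loss(z_g, z_l)
                    + lam1 * local_loss(similarity, z_l)
                    + lam2 * global_loss(z_g))
            opt.zero_grad()
            loss.backward()
            opt.step()
```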