CuCo: Graph Representation with Curriculum Contrastive Learning
Authors: Guanyi Chu, Xiao Wang, Chuan Shi, Xunqiang Jiang
IJCAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on fifteen graph classification real-world datasets, as well as the parameter analysis, well demonstrate that our proposed CuCo yields truly encouraging results in terms of performance on classification and convergence. |
| Researcher Affiliation | Academia | Guanyi Chu, Xiao Wang, Chuan Shi and Xunqiang Jiang, Beijing University of Posts and Telecommunications {cgy463, xiaowang, shichuan, skd621}@bupt.edu.cn |
| Pseudocode | Yes | Algorithm 1: Training procedure of CuCo (see the hedged training-loop sketch below the table) |
| Open Source Code | No | The paper does not provide any explicit statement about open-sourcing its code, nor does it include a link to a code repository. |
| Open Datasets | Yes | We evaluate model performance on seven classical graph classification benchmarks shown in Table 1... we evaluate model performance on eight Open Graph Benchmark (OGB) [Weihua et al., 2020b] molecule property prediction datasets. |
| Dataset Splits | Yes | We use 10-fold cross validation accuracy to report the classification performance. (See the evaluation sketch below the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory specifications) used for running the experiments. |
| Software Dependencies | No | The paper mentions employing 'graph neural networks (GNNs)' and specifically 'a three-layer Graph Isomorphism Network (GIN)', but it does not specify software dependencies with version numbers (e.g., PyTorch 1.x, TensorFlow 2.x, Python 3.x). |
| Experiment Setup | Yes | For our proposed model, we adopt a three-layer Graph Isomorphism Network (GIN) with 32-dimensional hidden units and a sum pooling readout function for performance comparisons. We use 10-fold cross validation accuracy to report the classification performance. Experiments are repeated 5 times. (See the encoder sketch below the table.) |
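
The experiment-setup row describes the encoder concretely: a three-layer GIN with 32-dimensional hidden units and a sum-pooling readout. Below is a minimal sketch of such an encoder in PyTorch Geometric. It is not the authors' released code (none is linked above); the per-layer MLP shape and ReLU placement are assumptions.

```python
# Minimal sketch of the encoder described in the experiment-setup row:
# a three-layer GIN with 32-dimensional hidden units and a sum-pooling readout.
import torch
import torch.nn as nn
from torch_geometric.nn import GINConv, global_add_pool


class GINEncoder(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int = 32, num_layers: int = 3):
        super().__init__()
        self.convs = nn.ModuleList()
        for layer in range(num_layers):
            # Two-layer MLP inside each GINConv is an assumption, not stated in the table.
            mlp = nn.Sequential(
                nn.Linear(in_dim if layer == 0 else hidden_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, hidden_dim),
            )
            self.convs.append(GINConv(mlp))

    def forward(self, x, edge_index, batch):
        # Message passing over the three GIN layers.
        for conv in self.convs:
            x = torch.relu(conv(x, edge_index))
        # Sum-pooling readout: one 32-dimensional embedding per graph.
        return global_add_pool(x, batch)
```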
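The pseudocode row refers to Algorithm 1, the training procedure of CuCo, which is not reproduced here. The following is therefore only a hedged sketch of a curriculum-style contrastive objective: an InfoNCE loss whose candidate negatives are ordered from easy to hard by similarity to the anchor, with a linear pacing function that admits more (and harder) negatives as training proceeds. The scoring function, pacing schedule, and temperature value are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of a curriculum contrastive objective (assumed, not the paper's code):
# InfoNCE over negatives that are sorted easy-to-hard and admitted by a pacing function.
import torch
import torch.nn.functional as F


def curriculum_info_nce(anchor, positive, negatives, epoch, total_epochs, tau=0.5):
    """anchor, positive: (d,) embeddings; negatives: (K, d) candidate negative embeddings."""
    # Score candidate negatives by similarity to the anchor (more similar = harder).
    neg_sim = F.cosine_similarity(anchor.unsqueeze(0), negatives, dim=-1)
    order = torch.argsort(neg_sim)  # easy (low similarity) first, hard last

    # Linear pacing function (assumed): admit a growing fraction of negatives over training.
    k = max(1, int(len(negatives) * min(1.0, (epoch + 1) / total_epochs)))
    chosen = negatives[order[:k]]

    pos = F.cosine_similarity(anchor, positive, dim=-1) / tau
    neg = F.cosine_similarity(anchor.unsqueeze(0), chosen, dim=-1) / tau
    logits = torch.cat([pos.unsqueeze(0), neg], dim=0)
    # InfoNCE: the positive pair (class 0) should win against the selected negatives.
    target = torch.zeros(1, dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits.unsqueeze(0), target)
```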
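The dataset-splits and experiment-setup rows report 10-fold cross-validation accuracy with experiments repeated 5 times. A minimal evaluation sketch under that protocol is shown below; the choice of a linear SVM on frozen graph embeddings is an assumption, since the table does not record the downstream classifier.

```python
# Sketch of the reported protocol: 10-fold cross-validation accuracy, repeated 5 times.
# The LinearSVC downstream classifier is an assumption, not stated in the table.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score


def evaluate(embeddings: np.ndarray, labels: np.ndarray, repeats: int = 5):
    scores = []
    for seed in range(repeats):
        skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
        for train_idx, test_idx in skf.split(embeddings, labels):
            clf = LinearSVC().fit(embeddings[train_idx], labels[train_idx])
            preds = clf.predict(embeddings[test_idx])
            scores.append(accuracy_score(labels[test_idx], preds))
    # Mean and standard deviation over all 50 folds.
    return float(np.mean(scores)), float(np.std(scores))
```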