ASWT-SGNN: Adaptive Spectral Wavelet Transform-Based Self-Supervised Graph Neural Network
Authors: Ruyue Liu, Rong Yin, Yong Liu, Weiping Wang
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on eight benchmark datasets demonstrate that ASWT-SGNN accurately approximates the filter function in high-density spectral regions, avoiding costly eigen-decomposition. Furthermore, ASWT-SGNN achieves comparable performance to state-of-the-art models in node classification tasks. |
| Researcher Affiliation | Academia | Ruyue Liu (1,2), Rong Yin (1,2)*, Yong Liu (3), Weiping Wang (1). (1) Institute of Information Engineering, Chinese Academy of Sciences; (2) University of Chinese Academy of Sciences; (3) Renmin University of China. {liuruyue, yinrong, wangweiping}@iie.ac.cn, liuyonggsai@ruc.edu.cn |
| Pseudocode | No | The paper describes the model and its components using mathematical equations and textual explanations, but it does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not explicitly state that its source code is open-source or provide a link to a code repository for the implemented methodology. |
| Open Datasets | Yes | We evaluate the approach on eight benchmark datasets, which have been widely used in GCL methods. Specifically, citation datasets include Cora, CiteSeer and PubMed (Yang, Cohen, and Salakhudinov 2016), co-purchase and co-author datasets include Photo, Computers, CS and Physics (Suresh et al. 2021). Wikipedia dataset includes WikiCS (Mernyei and Cangea 2020). |
| Dataset Splits | Yes | The dataset is randomly partitioned, with 20% of nodes allocated to the training set, another 20% to the validation set, and the remaining 60% to the test set. |
| Hardware Specification | Yes | All experiments use PyTorch on a server with four NVIDIA A40 GPUs. |
| Software Dependencies | No | The paper mentions PyTorch as the software framework used, but it does not specify any version numbers for PyTorch or other key software dependencies. |
| Experiment Setup | Yes | ASWT-SGNN utilizes the Adam Optimizer with a learning rate of 0.001. The specific hyperparameters are as follows: the number of sampling points in the spectral domain, K, is set to 20, the feature update ratio, α, is set to 0.8, and the wavelet terms ratio, β, is set to 0.4. |
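The reported setup (a random 20%/20%/60% node split, Adam with a learning rate of 0.001, and hyperparameters K=20, α=0.8, β=0.4) can be sketched as below. This is an illustrative reconstruction for reproduction purposes, not the authors' code; the function and config-key names are assumptions.

```python
import random

def split_nodes(num_nodes, train_frac=0.2, val_frac=0.2, seed=0):
    """Randomly partition node indices into train/val/test sets
    (20% / 20% / 60%, as reported in the paper)."""
    rng = random.Random(seed)
    idx = list(range(num_nodes))
    rng.shuffle(idx)
    n_train = int(train_frac * num_nodes)
    n_val = int(val_frac * num_nodes)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

# Hyperparameters reported for ASWT-SGNN (key names are illustrative):
CONFIG = {
    "optimizer": "Adam",
    "lr": 1e-3,    # learning rate
    "K": 20,       # number of sampling points in the spectral domain
    "alpha": 0.8,  # feature update ratio
    "beta": 0.4,   # wavelet terms ratio
}

# Example: Cora has 2708 nodes.
train_idx, val_idx, test_idx = split_nodes(2708)
```

The seed argument is included so a reproduction can report results over multiple random partitions, since the paper specifies the split ratios but not a fixed split.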