Dynamic Spiking Graph Neural Networks
Authors: Nan Yin, Mengzhu Wang, Zhenghan Chen, Giulia De Masi, Huan Xiong, Bin Gu
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on three large-scale real-world dynamic graph datasets validate the effectiveness of Dy-SIGN on dynamic node classification tasks with lower computational costs. |
| Researcher Affiliation | Collaboration | Nan Yin (1), Mengzhu Wang (2), Zhenghan Chen (3), Giulia De Masi (4), Huan Xiong (5,1)*, Bin Gu (6,1)*; 1: Mohamed bin Zayed University of Artificial Intelligence; 2: School of Artificial Intelligence, Hebei University of Technology; 3: Microsoft Corporation; 4: Technology Innovation Institute; 5: Harbin Institute of Technology; 6: Jilin University |
| Pseudocode | Yes | Algorithm 1: Learning Algorithm of Dy-SIGN |
| Open Source Code | No | The paper does not provide any concrete statement or link regarding the availability of its source code. |
| Open Datasets | Yes | We conduct experiments on three large real-world graph datasets, i.e., DBLP (Lu et al. 2019), Tmall (Lu et al. 2019) and Patent (Hall, Jaffe, and Trajtenberg 2001). |
| Dataset Splits | Yes | As for the implementation, we follow the same settings as (Li et al. 2023) and report the Macro-F1 and Micro-F1 results under different training ratios (i.e., 40%, 60%, and 80%). Besides, we use 5% for validation. |
| Hardware Specification | No | The paper mentions 'under the same experiment environment' regarding memory consumption but does not specify any particular hardware components like GPU or CPU models. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies or libraries used in the experiments. |
| Experiment Setup | Yes | The hidden dimension of all the methods is set to 128, and the batch size to 1024. The total number of training epochs is 100, and the learning rate is 0.001. |
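
The Dataset Splits row reports training ratios of 40%, 60%, or 80% with an additional 5% of nodes held out for validation. Below is a minimal sketch of how such a node-level split could be produced; it is illustrative only, and the function name `make_split`, the use of NumPy, and the treatment of the remainder as a test set are assumptions rather than details stated in the paper.

```python
import numpy as np

def make_split(num_nodes, train_ratio, val_ratio=0.05, seed=0):
    """Randomly partition node indices into train/val/test sets.

    train_ratio follows the reported settings (0.4, 0.6, or 0.8);
    val_ratio is the 5% validation share; the remainder is assumed
    to be used for testing.
    """
    rng = np.random.default_rng(seed)
    perm = rng.permutation(num_nodes)
    n_train = int(train_ratio * num_nodes)
    n_val = int(val_ratio * num_nodes)
    train_idx = perm[:n_train]
    val_idx = perm[n_train:n_train + n_val]
    test_idx = perm[n_train + n_val:]
    return train_idx, val_idx, test_idx

# Example: 80% train / 5% val / 15% test on a graph with 10,000 nodes.
train_idx, val_idx, test_idx = make_split(10_000, train_ratio=0.8)
```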
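The Experiment Setup row lists a hidden dimension of 128, a batch size of 1024, 100 training epochs, and a learning rate of 0.001. A minimal PyTorch-style training skeleton wired to those values is sketched below; the stand-in model, the Adam optimizer, the cross-entropy loss, and the input/output dimensions are assumptions for illustration and are not specified in this section.

```python
import torch
import torch.nn as nn

# Hyperparameters as reported in the paper's experiment setup.
HIDDEN_DIM = 128
BATCH_SIZE = 1024
EPOCHS = 100
LEARNING_RATE = 1e-3

# Hypothetical stand-in for the paper's Dy-SIGN model: any nn.Module
# mapping node features to class logits fits this skeleton.
model = nn.Sequential(
    nn.Linear(64, HIDDEN_DIM),   # 64 input features is an assumed placeholder
    nn.ReLU(),
    nn.Linear(HIDDEN_DIM, 10),   # 10 classes is an assumed placeholder
)
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)  # optimizer choice assumed
criterion = nn.CrossEntropyLoss()

def train_epoch(loader):
    """One pass over mini-batches of (features, labels) of size BATCH_SIZE."""
    model.train()
    for features, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(features), labels)
        loss.backward()
        optimizer.step()
```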