Graph WaveNet for Deep Spatial-Temporal Graph Modeling
Authors: Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Chengqi Zhang
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on two public traffic network datasets, METR-LA and PEMS-BAY, demonstrate the superior performance of our algorithm. |
| Researcher Affiliation | Academia | 1Centre for Artificial Intelligence, FEIT, University of Technology Sydney, Australia 2Faculty of Information Technology, Monash University, Australia zonghan.wu-3@student.uts.edu.au, shirui.pan@monash.edu, {guodong.long, jing.jiang, chengqi.zhang}@uts.edu.au |
| Pseudocode | No | The paper does not include a dedicated pseudocode block or algorithm listing. |
| Open Source Code | Yes | The source codes of Graph WaveNet are publicly available from https://github.com/nnzhan/Graph-WaveNet. |
| Open Datasets | Yes | We verify Graph WaveNet on two public traffic network datasets, METR-LA and PEMS-BAY released by Li et al. [2018b]. |
| Dataset Splits | Yes | The datasets are split in chronological order with 70% for training, 10% for validation and 20% for testing. |
| Hardware Specification | Yes | Our experiments are conducted under a computer environment with one Intel(R) Core(TM) i9-7900X CPU @ 3.30GHz and one NVIDIA Titan Xp GPU card. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies such as programming languages, libraries, or frameworks (e.g., Python version, PyTorch/TensorFlow version). |
| Experiment Setup | Yes | To cover the input sequence length, we use eight layers of Graph WaveNet with a sequence of dilation factors 1, 2, 1, 2, 1, 2, 1, 2. We use Equation 4 as our graph convolution layer with a diffusion step K = 2. We randomly initialize node embeddings by a uniform distribution with a size of 10. We train our model using Adam optimizer with an initial learning rate of 0.001. Dropout with p=0.3 is applied to the outputs of the graph convolution layer. |
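The dataset-split row above describes a chronological 70/10/20 split. A minimal sketch of such a split, assuming the data is already ordered by time (the function and ratios below are illustrative, not the authors' code):

```python
import numpy as np

def chronological_split(data, train_ratio=0.7, val_ratio=0.1):
    """Split a time-ordered array into train/val/test without shuffling."""
    n = len(data)
    n_train = int(n * train_ratio)
    n_val = int(n * val_ratio)
    train = data[:n_train]
    val = data[n_train:n_train + n_val]
    test = data[n_train + n_val:]  # remaining ~20%
    return train, val, test

# Example: 10,000 time steps -> 7,000 / 1,000 / 2,000
train, val, test = chronological_split(np.arange(10_000))
```

The experiment-setup row lists the reported hyperparameters. The sketch below only collects them into a configuration and instantiates the corresponding PyTorch pieces; the node count is taken from the METR-LA dataset description, and no model class from the authors' repository is reproduced here:

```python
import torch
import torch.nn as nn

# Hyperparameters as reported in the paper (this is a configuration sketch,
# not the authors' implementation).
config = dict(
    num_layers=8,
    dilations=[1, 2, 1, 2, 1, 2, 1, 2],  # dilation factor per temporal layer
    diffusion_steps=2,                    # K = 2 in the paper's Equation 4
    node_emb_dim=10,                      # adaptive node embedding size
    dropout=0.3,                          # applied to graph-convolution outputs
    lr=1e-3,                              # initial Adam learning rate
)

num_nodes = 207  # METR-LA sensor count

# Learnable node embeddings, randomly initialized from a uniform distribution.
node_embeddings = nn.Parameter(
    torch.empty(num_nodes, config["node_emb_dim"]).uniform_()
)

optimizer = torch.optim.Adam([node_embeddings], lr=config["lr"])
dropout = nn.Dropout(p=config["dropout"])
```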