Directed Acyclic Graph Structure Learning from Dynamic Graphs
Authors: Shaohua Fan, Shuyang Zhang, Xiao Wang, Chuan Shi
AAAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct extensive simulation experiments with a broad range of settings that may be encountered in the real world, validating the effectiveness of our approach in revealing the feature generation mechanism of dynamic graphs. The experiments on real-world datasets demonstrate the rationality of the relationships inferred by Graph NOTEARS. |
| Researcher Affiliation | Academia | (1) Beijing University of Posts and Telecommunications, China; (2) Peng Cheng Laboratory, China. {fanshaohua, sonyazhang, xiaowang, shichuan}@bupt.edu.cn |
| Pseudocode | Yes | For the detailed pseudocode of Eq. (1), please refer to Appendix A.2 |
| Open Source Code | Yes | Code and data: https://github.com/googlebaba/Graph-NOTEARS. |
| Open Datasets | Yes | Code and data: https://github.com/googlebaba/Graph-NOTEARS. |
| Dataset Splits | No | The paper describes a temporal splitting strategy ("we use the first T-1 timestamps to predict last T-p timestamps"), which serves as a train/test split for time-series data, but it does not explicitly provide a separate validation split, nor percentages or sample counts for conventional training, validation, and test sets (a minimal temporal-split sketch is given after the table). |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory, cloud instances) used for running the experiments. |
| Software Dependencies | No | The paper does not specify particular software dependencies or library versions used in the experiments (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | For all methods, we set hyperparameters λ_W = λ_P = 0.01. For the weight thresholds, following (Zheng et al. 2018), we choose τ_W = τ_P = 0.3 for all the methods. A minimal thresholding sketch is given after the table. |
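The temporal splitting mentioned in the Dataset Splits row can be illustrated with a minimal sketch. It assumes the node features of the dynamic graph are stored as a tensor `X` of shape `(T, n, d)` (a hypothetical layout; the paper and repository may organize the data differently) and simply cuts along the time axis, keeping the leading timestamps for fitting and holding the rest out:

```python
import numpy as np

def temporal_split(X, n_train):
    """Split a dynamic-graph feature tensor along the time axis.

    X       : array of shape (T, n, d) -- node features for T timestamps
              (hypothetical layout, not taken from the paper).
    n_train : number of leading timestamps kept for fitting; the
              remaining timestamps are held out for evaluation.
    """
    return X[:n_train], X[n_train:]

# Example: 10 timestamps, 50 nodes, 8-dimensional features;
# fit on the first 9 timestamps and evaluate on the last one.
X = np.random.randn(10, 50, 8)
X_train, X_test = temporal_split(X, n_train=9)
```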
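The weight thresholds in the Experiment Setup row (τ_W = τ_P = 0.3) follow the post-processing convention of NOTEARS (Zheng et al. 2018): entries of the estimated weighted matrices whose absolute value falls below the threshold are pruned to zero before reading off the graph structure. The sketch below illustrates that step under this assumption; `W_est` and `P_est` are hypothetical placeholders for the intra-slice and inter-slice matrices produced by the learner, not outputs of the released code.

```python
import numpy as np

def threshold_weights(M_est, tau=0.3):
    """Zero out entries of an estimated weighted adjacency matrix whose
    absolute weight is below the threshold tau (NOTEARS-style pruning)."""
    M = M_est.copy()
    M[np.abs(M) < tau] = 0.0
    return M

# Hypothetical usage with tau_W = tau_P = 0.3 on randomly generated
# stand-ins for the learned intra-slice (W) and inter-slice (P) matrices.
W_est = np.random.uniform(-1.0, 1.0, size=(5, 5))
P_est = np.random.uniform(-1.0, 1.0, size=(5, 5))
W_pruned = threshold_weights(W_est, tau=0.3)
P_pruned = threshold_weights(P_est, tau=0.3)
```

The pruned matrices can then be binarized (nonzero entries become edges) to obtain the reported graph structures.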