Heterogeneous Graph Structure Learning for Graph Neural Networks
Authors: Jianan Zhao, Xiao Wang, Chuan Shi, Binbin Hu, Guojie Song, Yanfang Ye
AAAI 2021, pp. 4697-4705 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on real-world graphs demonstrate that the proposed framework significantly outperforms the state-of-the-art methods. We conduct extensive experiments on three real-world datasets to validate the effectiveness of HGSL against the state-of-the-art methods. |
| Researcher Affiliation | Collaboration | Jianan Zhao,1,2 Xiao Wang,1 Chuan Shi,1 Binbin Hu,3 Guojie Song,4 Yanfang Ye2 1 School of CS, Beijing University of Posts and Telecommunications, Beijing, China 2 Department of CDS, Case Western Reserve University, OH, USA 3 Ant Group 4 Key Laboratory of Machine Perception, Ministry of Education, Peking University |
| Pseudocode | No | The paper describes the model in detail with equations and figures but does not include any explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code and datasets are publicly available on Github1. 1https://github.com/Andy-Border/HGSL |
| Open Datasets | Yes | DBLP (Yun et al. 2019): This is a subset of DBLP which contains 4,328 papers (P), 2,957 authors (A), and 20 conferences (C)... ACM (Yun et al. 2019): We use the identical datasets and the experimental setting of GTN s. This dataset contains 3,025 papers (P), 5,912 authors (A), and 57 conference subjects (S)... Yelp (Lu et al. 2019): The Yelp dataset contains 2,614 businesses (B), 1,286 users (U), 4 services (S), and 9 rating levels (L). |
| Dataset Splits | Yes | The statistics of these datasets are shown in Table 1. DBLP: 600 training / 300 validation / 2,057 test; ACM: 600 training / 300 validation / 2,125 test; Yelp: 300 training / 300 validation / 2,014 test. |
| Hardware Specification | No | The paper does not specify any hardware details such as GPU or CPU models, memory, or specific computing environments used for the experiments. |
| Software Dependencies | No | The paper mentions general settings like 'the number of layers are set as 2' but does not provide specific software names with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | For all GNN-related models, the number of layers is set as 2. The feature dimension in common space d_c and the embedding dimension d for all methods are set as 16 and 64 respectively. We choose the popular metapaths adopted in previous methods... for metapath-based models and report the best result. For our proposed model, we use the 2-head cosine similarity function defined in Equation 3, i.e. K=2. We set learning rate and weight decay as 0.01 and 0.0005 respectively. Other hyper-parameters, namely ε_FS, ε_FP, ε_MP, and α, are tuned by grid search. |
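The K-head cosine similarity with threshold pruning referenced in the setup row (Equation 3 of the paper, K=2) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name, the elementwise per-head weight vectors `W`, and the symmetric thresholding are assumptions based on the description of learnable weighted cosine similarity with ε cut-offs.

```python
import numpy as np

def multi_head_cosine_similarity(H, W, eps):
    """Hypothetical sketch of K-head weighted cosine similarity.

    H:   (n, d) node features projected into the common space (d = d_c = 16).
    W:   (K, d) learnable per-head weight vectors, applied elementwise.
    eps: similarity threshold (e.g. eps_FS); entries below it are pruned to 0.
    Returns an (n, n) candidate graph as a dense similarity matrix.
    """
    n = H.shape[0]
    S = np.zeros((n, n))
    for k in range(W.shape[0]):
        Hw = H * W[k]                                   # reweight features for head k
        norms = np.linalg.norm(Hw, axis=1, keepdims=True)
        Hn = Hw / np.clip(norms, 1e-12, None)           # row-normalize
        S += Hn @ Hn.T                                  # cosine similarity, head k
    S /= W.shape[0]                                     # average over the K heads
    return np.where(S >= eps, S, 0.0)                   # keep only strong edges

# Toy usage with the paper's settings: K = 2 heads, common dimension d_c = 16
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 16))
W = np.ones((2, 16))          # in HGSL these weights would be learned
A = multi_head_cosine_similarity(H, W, eps=0.3)
```

In the full model this candidate matrix would be fused with the original heterogeneous graph structure via the weighting parameter α, which, like the ε thresholds, is tuned by grid search.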