Knowledge-aware Coupled Graph Neural Network for Social Recommendation

Authors: Chao Huang, Huance Xu, Yong Xu, Peng Dai, Lianghao Xia, Mengyin Lu, Liefeng Bo, Hao Xing, Xiaoping Lai, Yanfang Ye

Pages: 4115–4122

AAAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental studies on real-world datasets show the effectiveness of our method against many strong baselines in a variety of settings.
Researcher Affiliation | Collaboration | 1 JD Finance America Corporation, USA; 2 South China University of Technology, China; 3 Peng Cheng Laboratory, China; 4 Communication and Computer Network Laboratory of Guangdong, China; 5 VIPS Research, China; 6 Case Western Reserve University, USA
Pseudocode | No | The paper does not contain a clearly labeled pseudocode or algorithm block.
Open Source Code | Yes | Source codes are available at: https://github.com/xhcdream/KCGN.
Open Datasets | No | Epinions: this data records the user's feedback over different items from the social network-based review system Epinions (Fan et al. 2019). Yelp: this data is collected from the Yelp platform, in which user-item interactions are differentiated with the same split rubric as in Epinions; furthermore, users' social connections (with common interests) are contained in this data. E-Commerce: collected from a commercial e-commerce platform with different types of interactions, i.e., page view, add-to-cart, add-to-favorite, and purchase; user relations are constructed from their co-interact patterns.
Dataset Splits | No | We follow the evaluation settings in (Chen et al. 2019b; Wu et al. 2019a) and employ the leave-one-out method for generating training and test data instances. To be consistent with (Sun et al. 2019), we associate each positive instance with 99 negative samples. In our evaluations, we employ early stopping for training termination when the performance degrades for 5 continuous epochs on the validation data.
Hardware Specification | No | The paper does not explicitly describe the specific hardware (e.g., GPU/CPU models, RAM) used to run its experiments.
Software Dependencies | No | KCGN is implemented with PyTorch, and the Adam optimizer is adopted for parameter estimation. The paper mentions software names but not specific version numbers.
Experiment Setup | Yes | The training process is performed with the learning rate chosen from [0.001, 0.005, 0.01] and the batch size selected from [1024, 2048, 4096, 8192]. The embedding size is tuned over the range [8, 16, 32, 64]. In our evaluations, we employ early stopping for training termination when the performance degrades for 5 continuous epochs on the validation data.
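The leave-one-out protocol quoted in the Dataset Splits row — ranking each held-out positive item against 99 sampled negatives — can be sketched as below. This is a minimal illustrative sketch, not the authors' code: the function names, the HR@k/NDCG@k metrics, and the uniform negative sampling are assumptions based on the cited evaluation convention (Sun et al. 2019).

```python
import numpy as np

def evaluate_leave_one_out(score_fn, test_pairs, num_items,
                           k=10, num_negatives=99, seed=0):
    """Rank each held-out positive against `num_negatives` sampled
    negatives and report Hit Ratio@k and NDCG@k (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    hits, ndcgs = [], []
    for user, pos_item in test_pairs:
        # Sample 99 candidate negatives (here: any item except the positive;
        # a real pipeline would also exclude the user's training items).
        pool = [i for i in range(num_items) if i != pos_item]
        negatives = rng.choice(pool, size=num_negatives, replace=False)
        candidates = np.concatenate(([pos_item], negatives))
        scores = score_fn(user, candidates)
        # Position of the positive item (candidate index 0) in the ranking.
        rank = int(np.argsort(-scores).tolist().index(0))
        hits.append(1.0 if rank < k else 0.0)
        ndcgs.append(1.0 / np.log2(rank + 2) if rank < k else 0.0)
    return float(np.mean(hits)), float(np.mean(ndcgs))
```

With a hypothetical model that always scores the true item highest, both metrics come out to 1.0; a trained recommender's `score_fn` would instead compute user-item relevance from the learned embeddings.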