Signed Laplacian Graph Neural Networks

Authors: Yu Li, Meng Qu, Jian Tang, Yi Chang

AAAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results demonstrate that SLGNN outperforms various competitive baselines and achieves state-of-the-art performance.
Researcher Affiliation | Academia | Yu Li1,8*, Meng Qu2,3, Jian Tang2,4,5, Yi Chang6,7,8. 1 College of Computer Science and Technology, Jilin University, China; 2 Mila - Québec AI Institute, Canada; 3 Université de Montréal, Canada; 4 HEC Montréal, Canada; 5 CIFAR AI Research Chair, Canada; 6 School of Artificial Intelligence, Jilin University, China; 7 International Center of Future Science, Jilin University, China; 8 Engineering Research Center of Knowledge-Driven Human-Machine Intelligence, Ministry of Education, China
Pseudocode | No | The paper describes the model architecture and mathematical formulations but does not include a clearly labeled pseudocode or algorithm block.
Open Source Code | No | The paper does not provide an explicit statement about open-sourcing the code for the described methodology, nor does it provide any links to a code repository.
Open Datasets | Yes | We evaluate SLGNN on four popular signed graphs. Bitcoin Alpha and Bitcoin OTC are who-trusts-whom networks of people who trade on Bitcoin platforms and tag one another as trusted or distrusted. Slashdot is a friendship network of people who tag each other as friends or foes on the technology news website Slashdot. Epinions is a who-trusts-whom network of people who give trust or distrust tags on the consumer review site Epinions.
Dataset Splits | No | For each signed graph, we randomly select 20% of the positive and negative links as the test set, while ensuring that the residual signed graph remains connected; this residual graph is used as the training set. The paper specifies a train/test split but does not explicitly mention a separate validation set.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments (e.g., GPU models, CPU types, or memory specifications).
Software Dependencies | No | The paper mentions using
Experiment Setup | Yes | For our proposed method SLGNN, we set the number of self-gating mechanisms to M = 4 for Bitcoin Alpha, Slashdot, and Epinions, and M = 2 for Bitcoin OTC, and employ 2 message aggregation layers, with a node representation dropout rate of 0.5, a link coefficient dropout rate of 0.5, and a hidden representation dimension of 64. We use AdaGrad (Duchi, Hazan, and Singer 2011) to optimize SLGNN with a learning rate of 0.01 and a weight decay of 0.001.
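The split described under Dataset Splits (hold out 20% of signed links for testing while keeping the residual training graph connected) can be sketched as a greedy edge-removal loop. This is a minimal illustration, not the authors' released code; all function and variable names are our own, and the greedy strategy is one plausible way to satisfy the connectivity constraint:

```python
import random
from collections import deque

def is_connected(nodes, edges):
    """BFS connectivity check on the underlying undirected graph (signs ignored)."""
    adj = {n: set() for n in nodes}
    for u, v, _sign in edges:
        adj[u].add(v)
        adj[v].add(u)
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        n = queue.popleft()
        for m in adj[n] - seen:
            seen.add(m)
            queue.append(m)
    return seen == set(nodes)

def split_signed_graph(edges, test_ratio=0.2, seed=0):
    """Greedily move ~test_ratio of the signed links (u, v, sign) to a test
    set, skipping any removal that would disconnect the residual graph."""
    rng = random.Random(seed)
    nodes = {u for u, v, _ in edges} | {v for u, v, _ in edges}
    pool = list(edges)
    rng.shuffle(pool)
    train, test = list(edges), []
    target = int(test_ratio * len(edges))
    for e in pool:
        if len(test) >= target:
            break
        remaining = [x for x in train if x != e]
        if is_connected(nodes, remaining):
            train, test = remaining, test + [e]
    return train, test
```

The quadratic connectivity checks are fine for graphs of the scale reported here; a production split would use an incremental connectivity structure instead.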
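The Experiment Setup row pins down enough hyperparameters to collect them into a configuration sketch. The key names (`num_gates`, `hidden_dim`, etc.) are illustrative labels of our choosing, not the authors' actual config schema:

```python
# Shared hyperparameters reported for SLGNN (AAAI 2023); key names are illustrative.
COMMON = {
    "num_layers": 2,          # message aggregation layers
    "node_dropout": 0.5,      # node representation dropout rate
    "coeff_dropout": 0.5,     # link coefficient dropout rate
    "hidden_dim": 64,         # hidden representation dimension
    "optimizer": "adagrad",   # AdaGrad (Duchi, Hazan, and Singer 2011)
    "lr": 0.01,               # learning rate
    "weight_decay": 0.001,
}

# M = number of self-gating mechanisms, the one per-dataset hyperparameter.
PER_DATASET_M = {
    "bitcoin_alpha": 4,
    "bitcoin_otc": 2,
    "slashdot": 4,
    "epinions": 4,
}

def config_for(dataset):
    """Merge the shared settings with the dataset-specific gate count."""
    cfg = dict(COMMON)
    cfg["num_gates"] = PER_DATASET_M[dataset]
    return cfg
```

With PyTorch, the optimizer line would correspond to `torch.optim.Adagrad(params, lr=0.01, weight_decay=0.001)`.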