Total Variation Graph Neural Networks

Authors: Jonas Berg Hansen, Filippo Maria Bianchi

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results show that our model outperforms other GNNs for vertex clustering and graph classification."
Researcher Affiliation | Academia | "1Department of Mathematics and Statistics, UiT The Arctic University of Norway; 2NORCE, The Norwegian Research Centre AS."
Pseudocode | No | The paper describes algorithms and derivations in text and equations but does not include a dedicated pseudocode block or a section explicitly labeled 'Algorithm'.
Open Source Code | Yes | "The code to implement TVGNN is publicly available." (https://github.com/FilippoMB/Total-variation-graph-neural-networks)
Open Datasets | Yes | "In the vertex clustering experiment, we considered the citation networks Cora, Pubmed, Citeseer (Yang et al., 2016) and DBLP (Fu et al., 2020). In the graph classification experiment, we analyzed seven TUD datasets (Morris et al., 2020) and two synthetic datasets, Bench-easy and Bench-hard (Bianchi et al., 2022)."
Dataset Splits | Yes | "Training and testing are done with a stratified 5-fold train/test split. In addition, 10% of the training set is used as a validation set using a random stratified split."
Hardware Specification | Yes | "The authors gratefully acknowledge the support of Nvidia Corporation with the donation of the RTX A6000 GPUs used to perform the experimental evaluation."
Software Dependencies | No | "The GNN models were implemented using both Spektral (Grattarola & Alippi, 2020) and PyTorch Geometric (Fey & Lenssen, 2019). The methods for vertex embedding used in the vertex clustering experiment are based on the Karateclub (Rozemberczki et al., 2020) implementation." The quoted text names the libraries but gives no version numbers.
Experiment Setup | Yes | "The hyperparameters for TVGNN for both the vertex clustering and graph classification tasks are reported in Tab. 5. The parameter ϵ which ensures numerical stability for Γ was set to 1e-3 in all experiments."
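The split protocol quoted in the Dataset Splits row (a stratified 5-fold train/test split, with a further 10% stratified validation split carved out of each training fold) can be sketched in pure Python as follows. This is an illustrative sketch of the protocol, not the authors' code; the function names are ours.

```python
import random
from collections import defaultdict

def stratified_kfold(labels, k=5, seed=0):
    """Yield (train_idx, test_idx) pairs for a stratified k-fold split.

    Indices of each class are shuffled and dealt round-robin into k folds,
    so every fold preserves the overall class proportions.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for i, idx in enumerate(idxs):
            folds[i % k].append(idx)
    for i in range(k):
        test = sorted(folds[i])
        train = sorted(idx for j in range(k) if j != i for idx in folds[j])
        yield train, test

def stratified_val_split(train_idx, labels, frac=0.1, seed=0):
    """Carve a stratified validation set (10% by default) out of a training fold."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx in train_idx:
        by_class[labels[idx]].append(idx)
    val = []
    for idxs in by_class.values():
        rng.shuffle(idxs)
        n_val = max(1, round(frac * len(idxs)))
        val.extend(idxs[:n_val])
    val_set = set(val)
    train = [i for i in train_idx if i not in val_set]
    return train, sorted(val_set)
```

In practice one would typically use `sklearn.model_selection.StratifiedKFold` and `train_test_split(..., stratify=y)` for the same effect; the sketch above only makes the stratification logic explicit.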
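The ϵ = 1e-3 mentioned in the Experiment Setup row is a standard numerical-stability device: a small constant that keeps a denominator bounded away from zero. The following is a generic illustration of that idea, not the paper's actual definition of Γ; the function name is hypothetical.

```python
def stable_reciprocal(diff, eps=1e-3):
    # Clamp the magnitude from below by eps before inverting, so a
    # zero (or near-zero) difference cannot blow up to infinity.
    return 1.0 / max(abs(diff), eps)

# Well-separated values are inverted as usual; identical values
# are capped at 1/eps instead of dividing by zero.
print(stable_reciprocal(2.0))  # 0.5
print(stable_reciprocal(0.0))  # capped at 1/eps
```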