Recurrent Graph Neural Networks and Their Connections to Bisimulation and Logic

Authors: Maximilian Pflueger, David Tena Cucala, Egor V. Kostylev

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "In this paper, we start to fill this gap and study the foundations of GNNs that can perform more than a fixed number of message-passing iterations. We first formalise two generalisations of the basic GNNs: recurrent GNNs (Rec GNNs), which repeatedly apply message-passing iterations until the node classifications become stable, and graph-size GNNs (GSGNNs), which exploit a built-in function of the input graph size to decide the number of message-passings. We then formally prove that GNN classifiers are strictly less expressive than Rec GNN ones, and Rec GNN classifiers are strictly less expressive than GSGNN ones. To get this result, we identify novel semantic characterisations of the three formalisms in terms of suitable variants of bisimulation, which we believe have their own value for our understanding of GNNs. Finally, we prove syntactic logical characterisations of Rec GNNs and GSGNNs analogous to the logical characterisation of plain GNNs, where we connect the two formalisms to monadic monotone fixpoint logic, a generalisation of first-order logic that supports recursion." (A hedged code sketch of this run-until-stable idea follows the table.)
Researcher Affiliation | Academia | University of Oxford, United Kingdom; University of Oslo, Norway. Contact: maximilian.pflueger@cs.ox.ac.uk, david.tena.cucala@cs.ox.ac.uk, egork@ifi.uio.no
Pseudocode | No | The paper is theoretical and focuses on formal definitions, theorems, and proofs. It does not include any pseudocode or algorithm blocks.
Open Source Code | No | The paper is theoretical and does not present an implementation or release source code for the described formalisms.
Open Datasets | No | The paper is theoretical and does not describe empirical experiments, dataset usage, or public dataset availability.
Dataset Splits | No | The paper is theoretical and does not describe empirical experiments with data splits.
Hardware Specification | No | The paper is theoretical and does not describe any experimental setup that would require hardware specifications.
Software Dependencies | No | The paper is theoretical and does not provide software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup, hyperparameters, or training configurations.
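
To make the recurrent idea quoted under Research Type concrete, here is a minimal Python sketch of a Rec GNN style classifier: a single message-passing layer is applied repeatedly until the Boolean node classifications stop changing. Everything here (the function names, the toy max-based update, and the reachability example) is an illustrative assumption, not the authors' construction, which the paper defines purely mathematically.

```python
def rec_gnn_classify(adjacency, features, update, classify, max_iters=1000):
    """Apply one message-passing layer repeatedly until node labels are stable.

    adjacency: dict mapping each node to the list of neighbours it reads from.
    features:  dict mapping each node to its current feature value.
    update:    combines a node's feature with its neighbours' features to
               produce the next feature (one GNN layer).
    classify:  maps a feature to a Boolean node label.
    """
    labels = {v: classify(x) for v, x in features.items()}
    for _ in range(max_iters):  # safety cap; Rec GNNs run until stability
        features = {
            v: update(features[v], [features[u] for u in adjacency[v]])
            for v in adjacency
        }
        new_labels = {v: classify(x) for v, x in features.items()}
        if new_labels == labels:  # fixpoint reached: classifications stable
            return new_labels
        labels = new_labels
    return labels  # hit the cap without stabilising


if __name__ == "__main__":
    # Toy reachability check: label a node True iff a "marked" node is
    # reachable from it; node 2 carries the mark.
    graph = {0: [1], 1: [2], 2: [], 3: []}
    feats = {0: 0.0, 1: 0.0, 2: 1.0, 3: 0.0}
    step = lambda x, neighbours: max([x] + neighbours)  # propagate the mark
    print(rec_gnn_classify(graph, feats, step, lambda x: x > 0.5))
    # {0: True, 1: True, 2: True, 3: False}
```

Reachability-style properties are a natural fit for this loop: a GNN with a fixed number of layers only sees a bounded neighbourhood of each node, whereas the run-until-stable loop computes what is, in fixpoint-logic terms, roughly the least fixpoint μX.(Marked(x) ∨ ∃y(E(x,y) ∧ X(y))), which matches the abstract's connection between Rec GNNs and monadic monotone fixpoint logic.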