Generalization and Representational Limits of Graph Neural Networks
Authors: Vikas Garg, Stefanie Jegelka, Tommi Jaakkola
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | First, we prove that several important graph properties, e.g., shortest/longest cycle, diameter, or certain motifs, cannot be computed by GNNs that rely entirely on local information. Second, we provide the first data-dependent generalization bounds for message passing GNNs. This analysis explicitly accounts for the local permutation invariance of GNNs. Our bounds are much tighter than existing VC-dimension-based guarantees for GNNs, and are comparable to Rademacher bounds for recurrent neural networks. We outline some key proof steps for our results in the main text, and defer the detailed proofs to the Supplementary material. |
| Researcher Affiliation | Academia | 1CSAIL, MIT. Correspondence to: Vikas Garg <vgarg@csail.mit.edu>, Stefanie Jegelka <stefje@mit.edu>, Tommi Jaakkola <tommi@csail.mit.edu>. |
| Pseudocode | No | The paper provides theoretical analysis, propositions, and lemmas, but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any explicit statements about releasing source code or provide links to a code repository for the methodology described. |
| Open Datasets | No | The paper is a theoretical study of the representational limits and generalization bounds of GNNs; it neither uses nor provides access information for any public dataset. |
| Dataset Splits | No | The paper focuses on theoretical analysis and does not include details on dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper does not describe any experimental setup that would require specific hardware, and thus no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers required for reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not detail any experimental setup, hyperparameters, or training configurations. |
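For context on the table's "Research Type" row, the paper's first result concerns GNNs whose node updates depend only on local, permutation-invariant neighborhood aggregation. Below is a minimal sketch of one such message-passing round; the function name and sum aggregator are illustrative assumptions, not constructs from the paper.

```python
# Illustrative sketch: one synchronous round of local message passing.
# Sum aggregation over neighbors is permutation invariant, matching the
# "local permutation invariance" the paper's generalization analysis
# accounts for. Names here (message_passing_round) are hypothetical.

def message_passing_round(adj, features):
    """Each node combines its own feature with the sum of its
    neighbors' features. Information travels one hop per round,
    which is why properties needing global structure (diameter,
    longest cycle) escape purely local computation."""
    new_features = []
    for v, neighbors in enumerate(adj):
        agg = sum(features[u] for u in neighbors)  # permutation-invariant
        new_features.append(features[v] + agg)
    return new_features

# Path graph 0-1-2 with unit features: after one round the middle
# node has seen both endpoints, but each endpoint has seen only one
# neighbor.
adj = [[1], [0, 2], [1]]
feats = [1.0, 1.0, 1.0]
print(message_passing_round(adj, feats))  # -> [2.0, 3.0, 2.0]
```

After k rounds a node's representation depends only on its k-hop neighborhood, which is the locality restriction underlying the paper's impossibility results.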