On the Universality of Graph Neural Networks on Large Random Graphs
Authors: Nicolas Keriven, Alberto Bietti, Samuel Vaiter
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The code for the numerical illustrations is available at https://github.com/nkeriven/random-graph-gnn. Figure 3 illustrates Prop. 9 on a Gaussian kernel with random edges and two-hop input filtering: the x-axis is the latent variable in X = [-1, 1], and the y-axis is the output of an SGNN trained to approximate some function f (red curve). |
| Researcher Affiliation | Academia | Nicolas Keriven CNRS, GIPSA-lab, Grenoble, France nicolas.keriven@cnrs.fr Alberto Bietti NYU Center for Data Science, New York, USA alberto.bietti@nyu.edu Samuel Vaiter CNRS, LJAD, Nice, France samuel.vaiter@cnrs.fr |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code for the numerical illustrations is available at https://github.com/nkeriven/random-graph-gnn. |
| Open Datasets | No | The paper discusses theoretical models of random graphs like SBMs and radial kernels, but does not specify a publicly available named dataset for training purposes. |
| Dataset Splits | No | The paper does not specify dataset splits (e.g., train/validation/test percentages or counts) needed to reproduce the experiment. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependency details, such as library names with version numbers, needed to replicate the experiment. |
| Experiment Setup | No | The paper does not contain specific experimental setup details, such as concrete hyperparameter values or training configurations, in the main text. |
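Although the paper supplies no concrete experiment setup, the latent-variable random graph models it studies (e.g., radial kernels on a latent space X = [-1, 1]) are straightforward to simulate. The sketch below samples such a graph under assumed choices: a Gaussian kernel, a bandwidth of 0.3, and the function name `sample_latent_graph` are all illustrative, not taken from the paper or its repository.

```python
import numpy as np

def sample_latent_graph(n, sigma=0.3, seed=0):
    """Sample an undirected graph whose edge probabilities come from a
    Gaussian kernel on i.i.d. uniform latent variables x_i in [-1, 1].
    (Illustrative sketch; parameter values are assumptions.)"""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=n)  # latent positions in X = [-1, 1]
    # Edge-probability matrix W(x_i, x_j) = exp(-(x_i - x_j)^2 / (2 sigma^2))
    w = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * sigma**2))
    draws = rng.random((n, n)) < w      # independent Bernoulli draws
    adj = np.triu(draws, k=1)           # keep upper triangle, drop self-loops
    adj = adj | adj.T                   # symmetrize for an undirected graph
    return x, adj.astype(int)

x, adj = sample_latent_graph(100)
```

A reproduction attempt would still need the hyperparameters, training configuration, and hardware details that the paper omits; this only shows how the random-graph inputs could be generated.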