Mean-field theory of graph neural networks in graph partitioning
Authors: Tatsuro Kawamoto, Masashi Tsubaki, Tomoyuki Obuchi
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | A theoretical performance analysis of the graph neural network (GNN) is presented. ... This demonstrates a good agreement with numerical experiments. |
| Researcher Affiliation | Academia | Tatsuro Kawamoto, Masashi Tsubaki (Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology, 2-3-26 Aomi, Koto-ku, Tokyo, Japan; {kawamoto.tatsuro, tsubaki.masashi}@aist.go.jp); Tomoyuki Obuchi (Department of Mathematical and Computing Science, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo, Japan; obuchi@c.titech.ac.jp) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper mentions implementing the GNN using Chainer but does not provide any statement or link indicating that their specific implementation code is open-source or publicly available. |
| Open Datasets | No | The paper uses data generated from the Stochastic Block Model (SBM) rather than a pre-existing, publicly available dataset with concrete access information (see the SBM sketch after this table). |
| Dataset Splits | Yes | For the validation (development) set, 100 graph instances of the same SBMs are provided. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory, or cloud instance types) used for running the experiments. |
| Software Dependencies | Yes | We implemented the GNN using Chainer (version 3.2.0) [36]. |
| Experiment Setup | Yes | We set the dimension of the feature space to D = 100 and the number of layers to T = 100, and each result represents the average over 30 samples. ... We also employ residual networks (ResNets) [38] and batch normalization (BN) [39]. (See the GNN sketch after this table.) |
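
Since the benchmark data is generated from the SBM rather than downloaded, a minimal sketch of how such instances could be produced is given below. This is an illustrative reconstruction, not the authors' code: the group sizes, the edge probabilities `p_in`/`p_out`, and the use of `networkx` are assumptions.

```python
import networkx as nx
import numpy as np

def sample_sbm(n_nodes=1000, p_in=0.01, p_out=0.004, seed=None):
    # Symmetric two-group SBM; sizes and probabilities are illustrative
    # assumptions, not values taken from the paper.
    sizes = [n_nodes // 2, n_nodes - n_nodes // 2]
    probs = [[p_in, p_out], [p_out, p_in]]
    g = nx.stochastic_block_model(sizes, probs, seed=seed)
    # The generator stores each node's planted group in the "block" attribute.
    labels = np.array([g.nodes[v]["block"] for v in g.nodes])
    return g, labels

# Example: 100 instances of the same SBM, mirroring the paper's
# 100-graph validation (development) set.
val_set = [sample_sbm(seed=s) for s in range(100)]
```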
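
For the experiment-setup row, the following NumPy sketch illustrates one reading of the stated architecture: feature dimension D = 100, T = 100 layers, residual (ResNet-style) connections, and batch normalization. The authors implemented their GNN in Chainer; the layer-shared weight matrix, tanh nonlinearity, random feature initialization, and BN simplifications here are assumptions.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature dimension across nodes (no learned
    # scale/shift; a simplification).
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def gnn_forward(adj, D=100, T=100, seed=0):
    # adj: (n, n) adjacency matrix of one SBM instance,
    # e.g. adj = nx.to_numpy_array(g).
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    X = rng.normal(size=(n, D)) / np.sqrt(D)  # random initial features (assumption)
    W = rng.normal(size=(D, D)) / np.sqrt(D)  # layer-shared weights (assumption)
    for _ in range(T):
        X = X + np.tanh(adj @ X @ W)          # residual update
        X = batch_norm(X)                     # batch normalization over nodes
    return X                                  # final node embeddings
```

A partition can then be read off by clustering the rows of the returned embeddings, e.g. with k-means; that readout choice is likewise an assumption of this sketch.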