Approximation Ratios of Graph Neural Networks for Combinatorial Problems
Authors: Ryoma Sato, Makoto Yamada, Hisashi Kashima
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, from a theoretical perspective, we study how powerful graph neural networks (GNNs) can be for learning approximation algorithms for combinatorial problems. |
| Researcher Affiliation | Academia | Ryoma Sato (Kyoto University; RIKEN AIP), Makoto Yamada (Kyoto University; RIKEN AIP; JST PRESTO), Hisashi Kashima (Kyoto University; RIKEN AIP) |
| Pseudocode | Yes | Algorithm 1: Calculating the embedding of a node using GNNs; Algorithm 2: CPNGNN, the most powerful VVC-GNN (see the sketch after the table) |
| Open Source Code | No | The paper does not provide any specific links or statements about the release of source code for the described methodologies. |
| Open Datasets | No | The paper is a theoretical work on approximation ratios and does not train models on publicly available datasets to obtain its main findings. Although it discusses general graph properties and features, it does not cite or provide access information for any dataset used in its own research. |
| Dataset Splits | No | The paper is theoretical and does not describe a process of training, validation, or testing on specific datasets. |
| Hardware Specification | No | The paper is theoretical and does not describe any experiments that would require specific hardware for execution. |
| Software Dependencies | No | The paper is theoretical and does not mention specific software dependencies or version numbers used for any experimental setup. |
| Experiment Setup | No | The paper is theoretical and focuses on mathematical analysis of GNN capabilities rather than empirical experiments. Therefore, it does not provide details on experimental setup such as hyperparameters or system-level training settings. |
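
For context on the pseudocode noted above: Algorithm 1 in the paper is the generic message-passing procedure that computes a node embedding by repeatedly aggregating neighbour states. The snippet below is a minimal, non-learned sketch of that pattern, assuming a sum aggregator and a sum combine; the function name `gnn_node_embeddings` and the dictionary-based graph representation are illustrative, and the sketch omits the port numbering and learned update functions that the paper's CPNGNN (Algorithm 2) relies on.

```python
from typing import Dict, List


def gnn_node_embeddings(
    adjacency: Dict[int, List[int]],   # node id -> list of neighbour node ids
    features: Dict[int, List[float]],  # node id -> initial feature vector
    num_layers: int = 2,
) -> Dict[int, List[float]]:
    """Compute node embeddings by repeatedly aggregating neighbour states."""
    # Layer-0 embeddings are the raw input features.
    h = {v: list(x) for v, x in features.items()}

    for _ in range(num_layers):
        new_h = {}
        for v, neighbours in adjacency.items():
            # AGGREGATE: permutation-invariant sum of neighbour embeddings.
            agg = [0.0] * len(h[v])
            for u in neighbours:
                agg = [a + b for a, b in zip(agg, h[u])]
            # COMBINE: merge the node's own state with the aggregate.
            # In a trained GNN a learned update (e.g. an MLP) is applied here.
            new_h[v] = [a + b for a, b in zip(h[v], agg)]
        h = new_h
    return h


if __name__ == "__main__":
    # Tiny example: a triangle graph with scalar node features.
    adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    features = {0: [1.0], 1: [2.0], 2: [3.0]}
    print(gnn_node_embeddings(adjacency, features, num_layers=2))
```

The sum aggregator is only one permutation-invariant choice; the paper's analysis concerns how the expressive power of such architectures (and the stronger port-numbered variants) bounds the approximation ratios achievable for combinatorial problems.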