Nested Graph Neural Networks

Authors: Muhan Zhang, Pan Li

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we study the effectiveness of the NGNN framework for graph classification and regression tasks. In particular, we want to answer the following questions: Q1 Can NGNN reach its theoretical power to discriminate 1-WL-indistinguishable graphs? Q2 How often and how much does NGNN improve the performance of a base GNN? Q3 How does NGNN perform in comparison to state-of-the-art GNN methods in open benchmarks? Q4 How much extra computation time does NGNN incur? We implement the NGNN framework based on the PyTorch Geometric library [51]. Our code is available at https://github.com/muhanzhang/NestedGNN." The experiments are described in Sections 5.1 (Datasets) and 5.3 (Results and discussion).
Researcher Affiliation | Academia | Muhan Zhang (1,2), Pan Li (3). Affiliations: 1 Institute for Artificial Intelligence, Peking University; 2 Beijing Institute for General Artificial Intelligence; 3 Department of Computer Science, Purdue University.
Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper. The methods are described in natural language and mathematical equations.
Open Source Code | Yes | "Our code is available at https://github.com/muhanzhang/NestedGNN."
Open Datasets | Yes | "To answer Q2, we use the QM9 dataset [52, 53] and the TU datasets [54]. ... To answer Q3, we use two Open Graph Benchmark (OGB) datasets [59], ogbg-molhiv and ogbg-molpcba."
Dataset Splits | Yes | Table 1 reports statistics and evaluation metrics of the QM9 and OGB datasets, with an 80/10/10 split ratio. For the TU datasets, "we uniformly use the 10-fold cross validation framework provided by PyTorch Geometric [66]".
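The 10-fold cross-validation protocol reported for the TU datasets can be sketched in plain Python. This is an illustration of the fold-splitting logic only, under the usual k-fold convention; `k_fold_indices` is a hypothetical helper, not the actual PyTorch Geometric utility the authors used.

```python
def k_fold_indices(num_graphs, k=10):
    """Partition graph indices into k disjoint folds.

    For each fold i, the i-th chunk of indices is the test set and the
    remaining chunks form the training set, so every graph appears in
    exactly one test fold across the k runs.
    """
    indices = list(range(num_graphs))
    fold_size = num_graphs // k
    folds = []
    for i in range(k):
        start = i * fold_size
        # The last fold absorbs any remainder so all indices are covered.
        end = (i + 1) * fold_size if i < k - 1 else num_graphs
        test = indices[start:end]
        train = indices[:start] + indices[end:]
        folds.append((train, test))
    return folds

# Example: 1000 graphs split into 10 folds of 100 test graphs each.
folds = k_fold_indices(1000, k=10)
```

Per-fold metrics would then be averaged across the 10 runs, which is the standard way results are reported under this framework.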
Hardware Specification | No | No specific hardware details (exact GPU/CPU models, processor types, or memory amounts) used for the experiments are provided. The paper only vaguely mentions "GPU memory" in the conclusion, without further specification.
Software Dependencies | No | The paper mentions the PyTorch Geometric library but does not specify a version number for it, or for any other software dependency such as Python or other libraries.
Experiment Setup | Yes | "For GNNs, we search the number of message passing layers in {2, 3, 4, 5}. For NGNNs, we similarly search the subgraph height h in {2, 3, 4, 5} ... All models have 32 hidden dimensions, and are trained for 100 epochs with a batch size of 128." And, for the OGB datasets: "For NGNN, we search the subgraph height h in {3, 4, 5}, and the number of layers in {4, 5, 6}. We train the NGNN models for 100 and 150 epochs for ogbg-molhiv and ogbg-molpcba, respectively."
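The hyperparameter grids quoted above can be enumerated exhaustively with a small sketch. The dictionary keys and the `grid_configs` helper are illustrative names of my own, assuming a plain grid search over the reported value sets, not the authors' actual tuning code.

```python
from itertools import product

# Search spaces as reported in the paper (illustrative encoding).
TU_NGNN_GRID = {
    "subgraph_height_h": [2, 3, 4, 5],  # base GNNs instead search layers in {2, 3, 4, 5}
}
OGB_NGNN_GRID = {
    "subgraph_height_h": [3, 4, 5],
    "num_layers": [4, 5, 6],
}

def grid_configs(grid):
    """Enumerate every combination of hyperparameter values in the grid."""
    keys = list(grid)
    return [dict(zip(keys, values))
            for values in product(*(grid[k] for k in keys))]

# For the OGB datasets: 3 heights x 3 layer counts = 9 configurations,
# each trained for 100 (ogbg-molhiv) or 150 (ogbg-molpcba) epochs.
ogb_configs = grid_configs(OGB_NGNN_GRID)
```

Enumerating the grid up front makes the total training budget explicit, which is useful when reproducing the reported runs.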