Can Graph Neural Networks Count Substructures?

Authors: Zhengdao Chen, Lei Chen, Soledad Villar, Joan Bruna

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We then conduct experiments that support the theoretical results for MPNNs and 2-IGNs. We empirically demonstrate that it [LRP] can count both subgraphs and induced subgraphs on random synthetic graphs while also achieving competitive performance on molecular datasets."
Researcher Affiliation | Academia | Zhengdao Chen, New York University (zc1216@nyu.edu); Lei Chen, New York University (lc3909@nyu.edu); Soledad Villar, Johns Hopkins University (soledad.villar@jhu.edu); Joan Bruna, New York University (bruna@cims.nyu.edu)
Pseudocode | No | The paper describes the models and methods using mathematical equations and textual explanations but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | "Code available at https://github.com/leichen2018/GNN-Substructure-Counting."
Open Datasets | Yes | "We evaluate LRP on the molecular prediction datasets ogbg-molhiv [63], QM9 [54] and ZINC [14]."
Dataset Splits | Yes | "For the counting tasks, we generate a dataset of 1000 graphs, consisting of 800 training graphs, 100 validation graphs, and 100 testing graphs, all of them of size ranging from 10 to 50 nodes. For ogbg-molhiv, QM9 and ZINC, we follow the settings described in [63, 39, 14] for data splits..."
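The counting-task split quoted above can be sketched in plain Python. The 1000-graph total, the 800/100/100 proportions, and the 10-to-50-node size range come from the paper's description; the random seed, the shuffling step, and the stand-in graph representation (a bare node count) are illustrative assumptions, not details from the paper.

```python
import random

def make_counting_splits(n_graphs=1000, seed=0):
    """Split synthetic counting-task graphs into train/val/test sets.

    800/100/100 proportions and 10-50 node sizes follow the paper;
    the seed and the graph stand-in are assumptions for illustration.
    """
    rng = random.Random(seed)
    # Stand-in for real random graphs: each "graph" is just its node count.
    graphs = [rng.randint(10, 50) for _ in range(n_graphs)]
    rng.shuffle(graphs)
    return graphs[:800], graphs[800:900], graphs[900:]

train, val, test = make_counting_splits()
```

In a real pipeline each element would be a graph object (e.g. an adjacency list plus subgraph-count labels) rather than an integer, but the split logic is unchanged.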
Hardware Specification | Yes | "Each model is trained on 1080ti five times with different random seeds."
Software Dependencies | No | The paper mentions various GNN models and general software components but does not provide specific version numbers for any libraries, frameworks, or solvers (e.g., Python, PyTorch, TensorFlow, CUDA versions).
Experiment Setup | Yes | "We use Adam optimizer [30] with a learning rate of 0.001 and a weight decay of 10^-5... For ogbg-molhiv, the total number of epochs is set to 1000 with a batch size of 32. For QM9, the total number of epochs is set to 2000 with a batch size of 128. For ZINC, the total number of epochs is set to 2000 with a batch size of 128."
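The hyperparameters quoted above can be collected into a small config table, which makes the shared-versus-per-dataset structure explicit. The numeric values are taken from the quoted setup; the dict layout and the `training_config` helper are editorial conveniences, not code from the paper.

```python
# Shared optimizer settings quoted in the experiment setup.
COMMON = {"optimizer": "Adam", "lr": 1e-3, "weight_decay": 1e-5}

# Per-dataset training schedules, as stated in the paper.
DATASETS = {
    "ogbg-molhiv": {"epochs": 1000, "batch_size": 32},
    "QM9":         {"epochs": 2000, "batch_size": 128},
    "ZINC":        {"epochs": 2000, "batch_size": 128},
}

def training_config(dataset):
    """Merge the shared optimizer settings with a dataset's schedule."""
    return {**COMMON, **DATASETS[dataset]}
```

For example, `training_config("ogbg-molhiv")` yields the Adam settings together with 1000 epochs and batch size 32.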