A Non-Asymptotic Analysis of Oversmoothing in Graph Neural Networks
Authors: Xinyi Wu, Zhengdao Chen, William Wei Wang, Ali Jadbabaie
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we support our theoretical results with numerical experiments, which further suggest that the oversmoothing phenomenon observed in practice can be magnified by the difficulty of optimizing deep GNN models. |
| Researcher Affiliation | Academia | Xinyi Wu¹, Zhengdao Chen², William Wang¹, Ali Jadbabaie¹. ¹Laboratory for Information and Decision Systems (LIDS), MIT; ²Courant Institute of Mathematical Sciences, New York University |
| Pseudocode | No | The paper presents theoretical analysis, lemmas, theorems, and experimental results, but does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper mentions using PyTorch and PyTorch Geometric for implementation, but it does not link to source code for the described methodology or state that the code is open source. |
| Open Datasets | Yes | We revisit the multi-class node classification task on the three widely used benchmark datasets: Cora, CiteSeer and PubMed (Yang et al., 2016). |
| Dataset Splits | Yes | We used 60%/20%/20% random splits and ran GNN and APPNP with α = 0.1. |
| Hardware Specification | No | The paper does not explicitly describe the specific hardware (e.g., GPU or CPU models, memory) used for running the experiments. It only mentions the software frameworks used. |
| Software Dependencies | No | The paper states: 'All models were implemented with PyTorch (Paszke et al., 2019) and PyTorch Geometric (Fey & Lenssen, 2019).' However, it does not provide specific version numbers for these software dependencies or other ancillary software components. |
| Experiment Setup | Yes | In all cases we use the Adam optimizer and tune some hyperparameters for better performance. The hyperparameters used are summarized as follows. (See the setup sketch below this table.) |
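
The rows above pin down most of the reported setup: the Planetoid benchmarks (Cora, CiteSeer, PubMed), 60%/20%/20% random splits, PyTorch with PyTorch Geometric, and the Adam optimizer. Because the paper releases no code, the following is a minimal, hypothetical reconstruction of that setup; the model depth, hidden width, learning rate, and epoch count are assumptions, not values taken from the paper.

```python
# Hypothetical reconstruction of the reported setup; the paper releases no
# code, so every value marked "assumed" is an illustrative guess.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

# One of the three quoted benchmarks (Cora / CiteSeer / PubMed).
dataset = Planetoid(root="data", name="Cora")
data = dataset[0]

def random_split(num_nodes: int, seed: int = 0):
    """60%/20%/20% random train/val/test masks, as reported in the table."""
    g = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_nodes, generator=g)
    n_train, n_val = int(0.6 * num_nodes), int(0.2 * num_nodes)
    masks = [torch.zeros(num_nodes, dtype=torch.bool) for _ in range(3)]
    masks[0][perm[:n_train]] = True
    masks[1][perm[n_train:n_train + n_val]] = True
    masks[2][perm[n_train + n_val:]] = True
    return masks

data.train_mask, data.val_mask, data.test_mask = random_split(data.num_nodes)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden, out_dim, depth=2):  # depth/width assumed
        super().__init__()
        dims = [in_dim] + [hidden] * (depth - 1) + [out_dim]
        self.convs = torch.nn.ModuleList(
            GCNConv(d_in, d_out) for d_in, d_out in zip(dims[:-1], dims[1:]))

    def forward(self, x, edge_index):
        for conv in self.convs[:-1]:
            x = F.relu(conv(x, edge_index))
        return self.convs[-1](x, edge_index)

model = GCN(dataset.num_node_features, 64, dataset.num_classes)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # lr assumed

model.train()
for epoch in range(200):  # epoch count assumed
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```

The APPNP variant quoted in the Dataset Splits row would replace the GCN with an MLP followed by `torch_geometric.nn.APPNP` propagation using `alpha=0.1`; the propagation depth `K` is not quoted above, so it would also have to be assumed.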
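
As for the phenomenon the paper analyzes, a common oversmoothing proxy (not necessarily the exact node-similarity measure the authors define) is the Frobenius distance of the node features to their mean, which shrinks toward zero as representations collapse. A self-contained toy illustration using pure GCN-style propagation:

```python
import torch

def mean_distance(x: torch.Tensor) -> torch.Tensor:
    # Distance of node features to their mean row: a simple oversmoothing
    # proxy, not necessarily the measure defined in the paper.
    return torch.norm(x - x.mean(dim=0, keepdim=True))

# Toy graph: a 5-node cycle with self-loops, symmetrically normalized
# (the GCN propagation matrix D^{-1/2} (A + I) D^{-1/2}).
n = 5
adj = torch.eye(n)  # self-loops
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
deg_inv_sqrt = adj.sum(dim=1).rsqrt()
prop = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]

x = torch.randn(n, 8)
for step in range(1, 31):
    x = prop @ x  # pure message passing: no weights, no nonlinearity
    if step % 5 == 0:
        print(step, float(mean_distance(x)))
```

On this toy graph the printed distance decays geometrically, at a rate set by the second-largest eigenvalue magnitude of the propagation matrix; that collapse of node representations is the qualitative behavior the paper quantifies with non-asymptotic bounds.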