Sequential Signal Mixing Aggregation for Message Passing Graph Neural Networks

Authors: Mitchell Keren Taraday, Almog David, Chaim Baskin

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | By performing extensive experiments, we show that when combining SSMA with well-established MPGNN architectures, we achieve substantial performance gains across various benchmarks, achieving new state-of-the-art results in many settings.
Researcher Affiliation | Academia | Mitchell Keren Taraday, Department of Computer Science, Technion, Haifa, Israel (butovsky.mitchell@gmail.com); Almog David, Department of Computer Science, Technion, Haifa, Israel (almogdavid@gmail.com); Chaim Baskin, School of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Be'er Sheva, Israel (chaimbaskin@bgu.ac.il)
Pseudocode | No | The paper describes methods and processes in text and with diagrams (e.g., Figure 3 for the SSMA visualization), but it does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | We published our code at https://almogdavid.github.io/SSMA/
Open Datasets | Yes | We observe significant gains across all benchmarks tested, including the TU datasets [32], open graph benchmark (OGB) [21] datasets, long-range graph benchmarks (LRGB) [16] datasets and the ZINC [19] molecular property prediction dataset, achieving state-of-the-art results in many settings. (A dataset-loading sketch follows the table.)
Dataset Splits | Yes | As these [TU] datasets lack official train/validation/test splits, we employ random 10-fold cross-validation for evaluation. [...] We used the official train/validation/test splits provided by the OGB team. (A cross-validation sketch follows the table.)
Hardware Specification | Yes | We conduct our experiments using PyTorch Geometric as the underlying framework, running them on NVIDIA RTX A5000 GPUs.
Software Dependencies | No | The paper mentions "PyTorch Geometric as the underlying framework" but does not specify, within the paper's text, its version or the versions of any other key software libraries or solvers used in the experiments. While the NeurIPS checklist indicates that the code repository specifies versions, this information is not in the paper itself.
Experiment Setup | Yes | Experimental Setup. We test the effectiveness of SSMA by incorporating it into popular MPGNN architectures. [...] For a detailed discussion on the parameter budget in each experiment, please refer to Appendix C.4. Given the budget for each experiment, we conduct a hyperparameter search (HPS) on SSMA parameters to find the best configuration. We further perform ablation studies to closely examine the effect of each hyperparameter, as detailed in Appendix E. [...] We used the Weights & Biases platform to perform hyperparameter searches (HPS), aiming to identify the optimal configuration for each dataset. [...] MLP compression strength: we search for the optimal compression rate... Neighbor selection method: we search for the best neighbor selection method. [...] Effective neighborhood size: we search for the optimal neighborhood size κ. (A sweep-configuration sketch follows the table.)
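
For context on the Open Datasets row, the snippet below is a minimal, hedged sketch of how the listed benchmark families can be loaded with PyTorch Geometric and the OGB package. The specific dataset names ("MUTAG", "Peptides-func", "ogbg-molhiv") are illustrative placeholders, not necessarily the datasets evaluated in the paper; LRGBDataset requires a recent PyTorch Geometric release.

```python
# Hedged sketch: loading the benchmark families cited in the paper with
# PyTorch Geometric and the OGB package. Dataset names below are illustrative.
from torch_geometric.datasets import TUDataset, ZINC, LRGBDataset
from ogb.graphproppred import PygGraphPropPredDataset

tu = TUDataset(root="data/TU", name="MUTAG")                    # a TU dataset [32]
zinc = ZINC(root="data/ZINC", subset=True, split="train")       # ZINC [19] molecular dataset
lrgb = LRGBDataset(root="data/LRGB", name="Peptides-func")      # an LRGB [16] task (recent PyG versions)
ogb = PygGraphPropPredDataset(root="data/OGB", name="ogbg-molhiv")  # an OGB [21] graph-level task
split_idx = ogb.get_idx_split()  # official train/validation/test indices from the OGB team
```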
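The Dataset Splits row quotes a random 10-fold cross-validation protocol for the TU datasets that lack official splits. The snippet below is a generic sketch of such a protocol, not the authors' code; the dataset, batch size, and random seed are placeholder choices.

```python
# Generic sketch of random 10-fold cross-validation over a TU dataset
# (placeholder dataset, batch size, and seed; not the authors' code).
import torch
from sklearn.model_selection import KFold
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader

dataset = TUDataset(root="data/TU", name="MUTAG")
kfold = KFold(n_splits=10, shuffle=True, random_state=0)

for fold, (train_idx, test_idx) in enumerate(kfold.split(list(range(len(dataset))))):
    train_loader = DataLoader(dataset[torch.tensor(train_idx)], batch_size=32, shuffle=True)
    test_loader = DataLoader(dataset[torch.tensor(test_idx)], batch_size=32)
    # ... train an MPGNN + SSMA model on train_loader and evaluate on test_loader ...
```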
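The Experiment Setup row mentions a Weights & Biases hyperparameter search over SSMA's MLP compression strength, neighbor selection method, and effective neighborhood size κ. The sweep configuration below is a hypothetical sketch of that kind of search; the parameter names, value grids, project name, and search method are assumptions rather than the authors' actual configuration.

```python
# Hypothetical Weights & Biases sweep over the SSMA hyperparameters named in the paper.
# Parameter names, value grids, project name, and search method are assumptions.
import wandb

sweep_config = {
    "method": "random",                                    # search strategy (assumption)
    "metric": {"name": "val_metric", "goal": "maximize"},
    "parameters": {
        "compression_rate": {"values": [0.25, 0.5, 1.0]},            # MLP compression strength
        "neighbor_selection": {"values": ["method_a", "method_b"]},  # neighbor selection method (placeholders)
        "kappa": {"values": [2, 4, 8, 16]},                          # effective neighborhood size κ
    },
}

def run_trial():
    with wandb.init() as run:
        cfg = run.config
        # ... build and train an MPGNN + SSMA model using cfg.compression_rate,
        #     cfg.neighbor_selection, and cfg.kappa, then report validation performance ...
        wandb.log({"val_metric": 0.0})  # placeholder value

sweep_id = wandb.sweep(sweep_config, project="ssma-hps")  # project name is illustrative
wandb.agent(sweep_id, function=run_trial, count=20)
```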