Verifying message-passing neural networks via topology-based bounds tightening

Authors: Christopher Hojny, Shiqiang Zhang, Juan S. Campos, Ruth Misener

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "To demonstrate the effectiveness of these strategies, we implement an extension to the open-source branch-and-cut solver SCIP. We test on both node and graph classification problems and consider topological attacks that both add and remove edges."
Researcher Affiliation | Academia | "¹Eindhoven University of Technology, Eindhoven, The Netherlands. ²Department of Computing, Imperial College London, UK."
Pseudocode | Yes | "Algorithm 1 Static bounds tightening (sbt)" ... "Algorithm 2 Aggressive bounds tightening (abt)"
Open Source Code | Yes | "The code is available at GitHub, also see Hojny & Zhang (2024)." Hojny, C. and Zhang, S. SCIP-MPNN: Code for the paper "Verifying message-passing neural networks via topology-based bounds tightening". https://doi.org/10.5281/zenodo.11208355, 2024.
Open Datasets | Yes | "We evaluate the performance of various verification methods on benchmarks including: (i) MUTAG and ENZYMES (Morris et al., 2020) for graph classification, and (ii) Cora and CiteSeer (Yang et al., 2023) for node classification. All datasets are available in PyG and summarized in Table 1." A loading sketch for these datasets appears after this table.
Dataset Splits | No | "For graph classification, ... 30% of the graphs are used to train the model. For node classification, ... 10% of the nodes are used for training." The paper specifies only these training fractions; it gives no explicit percentages or counts for validation and test splits. A split sketch based on the stated fractions appears after this table.
Hardware Specification | Yes | "All experiments have been conducted on a Linux cluster with 12 Intel Xeon Platinum 8260 2.40 GHz processors, each having 48 physical threads."
Software Dependencies | Yes | "All GNNs are built and trained using PyG (PyTorch Geometric) 2.1.0 (Fey & Lenssen, 2019). All MIPs are implemented in C/C++ using the open-source MIP solver SCIP 8.0.4 (Bestuzheva et al., 2023); all LP relaxations are solved using SoPlex 6.0.4 (Gamrath et al., 2020). ... Gurobi 10.0.3 (GRBbasic, GRBsbt) (Gurobi Optimization, LLC, 2023)." An environment check for the Python-side dependencies appears after this table.
Experiment Setup | Yes | "All models are trained for 200 epochs with learning rate 0.01, weight decay 10⁻⁴, and dropout 0.5. ... In our experiments, we choose s ∈ {2, 3, 4}, and use δ% of the number of edges as the global budget Q, where 1 ≤ δ ≤ 10. For node classification, ... We set 10 as the global budget and 5 as the local budget." A training-loop sketch using these hyperparameters appears after this table.
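The four benchmarks quoted in the Open Datasets row are all distributed with PyG. A minimal loading sketch is given below, assuming the standard TUDataset and Planetoid wrappers; the root directories are placeholders, not taken from the paper.

```python
from torch_geometric.datasets import TUDataset, Planetoid

# Graph classification benchmarks (TU collection, Morris et al., 2020).
mutag = TUDataset(root="data/TUDataset", name="MUTAG")
enzymes = TUDataset(root="data/TUDataset", name="ENZYMES")

# Node classification benchmarks (Planetoid citation networks).
cora = Planetoid(root="data/Planetoid", name="Cora")
citeseer = Planetoid(root="data/Planetoid", name="CiteSeer")

print(len(mutag), len(enzymes), cora[0].num_nodes, citeseer[0].num_nodes)
```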
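For the Dataset Splits row, only the training fractions are reported (30% of graphs, 10% of nodes). The sketch below illustrates those stated fractions with a uniformly random selection; the random selection, the seed, and the handling of the remaining data are assumptions, since the paper does not specify them.

```python
import torch

def graph_train_split(num_graphs: int, train_frac: float = 0.3, seed: int = 0):
    """Pick 30% of the graphs for training; how the remaining graphs are
    divided into validation/test is not specified in the paper."""
    gen = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_graphs, generator=gen)
    n_train = int(train_frac * num_graphs)
    return perm[:n_train], perm[n_train:]

def node_train_mask(num_nodes: int, train_frac: float = 0.1, seed: int = 0):
    """Mark 10% of the nodes for training (node classification)."""
    gen = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_nodes, generator=gen)
    mask = torch.zeros(num_nodes, dtype=torch.bool)
    mask[perm[: int(train_frac * num_nodes)]] = True
    return mask
```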
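Because the solver stack is version-pinned, a quick sanity check of the Python-side packages may help when reproducing; SCIP 8.0.4, SoPlex 6.0.4, and Gurobi 10.0.3 are external C/C++ dependencies and are not covered by this snippet.

```python
import torch
import torch_geometric

# The paper pins PyG 2.1.0; SCIP, SoPlex, and Gurobi are built separately in C/C++.
print("PyTorch:", torch.__version__)
print("PyG:", torch_geometric.__version__)
assert torch_geometric.__version__.startswith("2.1"), "paper reports PyG 2.1.0"
```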
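The quoted hyperparameters (200 epochs, learning rate 0.01, weight decay 10⁻⁴, dropout 0.5) could be wired into a PyG training loop roughly as sketched below; the two-layer GCN architecture, the Adam optimizer, and the cross-entropy loss are illustrative assumptions, not details confirmed by the paper.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GNN(torch.nn.Module):
    """Illustrative two-layer GCN; the paper's exact MPNN architectures may differ."""
    def __init__(self, in_dim: int, hidden: int, num_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)  # dropout 0.5 (quoted)
        return self.conv2(x, edge_index)

def train(model, data, epochs: int = 200):
    # 200 epochs, lr 0.01, weight decay 1e-4 are quoted; Adam is an assumption.
    opt = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=1e-4)
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        out = model(data.x, data.edge_index)
        loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
        loss.backward()
        opt.step()
```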