Graph Edit Distance with General Costs Using Neural Set Divergence

Authors: Eeshaan Jain, Indradyumna Roy, Saswat Meher, Soumen Chakrabarti, Abir De

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on several datasets, under a variety of edit cost settings, show that GRAPHEDX consistently outperforms state-of-the-art methods and heuristics in terms of prediction error.
Researcher Affiliation | Academia | EPFL; IIT Bombay. eeshaan.jain@epfl.ch; {saswatmeher,soumen,indraroy15,abir}@cse.iitb.ac.in
Pseudocode | Yes | In Algorithm 1, we present the pseudocode to generate the optimal edit path given the learnt node and edge alignments from GRAPHEDX.
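The paper's Algorithm 1 is not reproduced here. As a rough illustration of the general idea, the sketch below recovers node and edge edit operations from a hard node alignment between two graphs padded with dummy nodes; every name (edit_path_from_alignment, perm, adj1, n_real1, ...) is an illustrative assumption, not the authors' code.

# Illustrative sketch only (not the paper's Algorithm 1): recover an edit
# path from a hard node alignment between two padded graphs of equal size.
# perm[i] = j maps node i of G1 to node j of G2; indices >= n_real are dummies.
def edit_path_from_alignment(labels1, labels2, adj1, adj2, perm, n_real1, n_real2):
    ops = []
    # Node edits: substitutions, deletions (real -> dummy), insertions (dummy -> real).
    for i, j in enumerate(perm):
        real_i, real_j = i < n_real1, j < n_real2
        if real_i and real_j and labels1[i] != labels2[j]:
            ops.append(("substitute-node", i, j))
        elif real_i and not real_j:
            ops.append(("delete-node", i))
        elif not real_i and real_j:
            ops.append(("insert-node", j))
    # Edge edits: compare G1's adjacency, mapped through perm, with G2's.
    n = len(perm)
    for i in range(n):
        for k in range(i + 1, n):
            a, b = perm[i], perm[k]
            if adj1[i][k] and not adj2[a][b]:
                ops.append(("delete-edge", (i, k)))
            elif not adj1[i][k] and adj2[a][b]:
                ops.append(("insert-edge", (a, b)))
    return ops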
Open Source Code | Yes | The code is available at https://github.com/structlearning/GraphEdX.
Open Datasets | Yes | We experiment with seven real-world datasets: Mutagenicity (Mutag) [18], Ogbg-Code2 (Code2) [23], Ogbg-Molhiv (Molhiv) [23], Ogbg-Molpcba (Molpcba) [23], AIDS [36], Linux [5] and Yeast [36].
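Several of the listed datasets have public loaders. The sketch below shows one possible way to fetch them with PyTorch Geometric and OGB; the paper's exact copies, label handling, and graph-pair construction may differ, and the data/ roots are placeholder paths.

# Possible public sources for some of the listed datasets (assumption, not
# the authors' data pipeline).
from torch_geometric.datasets import TUDataset, GEDDataset
from ogb.graphproppred import PygGraphPropPredDataset

mutag   = TUDataset(root="data/tu", name="Mutagenicity")
yeast   = TUDataset(root="data/tu", name="Yeast")
code2   = PygGraphPropPredDataset(name="ogbg-code2", root="data/ogb")
molhiv  = PygGraphPropPredDataset(name="ogbg-molhiv", root="data/ogb")
molpcba = PygGraphPropPredDataset(name="ogbg-molpcba", root="data/ogb")
aids    = GEDDataset(root="data/ged/aids", name="AIDS700nef")  # SimGNN-style AIDS subset
linux   = GEDDataset(root="data/ged/linux", name="LINUX")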
Dataset Splits | Yes | We divide each dataset into training, validation and test folds with a split ratio of 60:20:20.
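A minimal sketch of how a 60:20:20 split with a fixed seed could be produced; split_indices is a hypothetical helper and the authors' actual fold assignment may differ.

# Hypothetical 60:20:20 split of dataset indices with a fixed seed.
import random

def split_indices(n, seed=0, ratios=(0.6, 0.2, 0.2)):
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_train = int(ratios[0] * n)
    n_val = int(ratios[1] * n)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

train_idx, val_idx, test_idx = split_indices(1000)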
Hardware Specification | Yes | The training of our models and the baselines was performed across servers containing Intel Xeon Silver 4216 2.10GHz CPUs, and Nvidia RTX A6000 GPUs.
Software Dependencies | Yes | We implement our models using Python 3.11.2 and PyTorch 2.0.0.
Experiment Setup | Yes | The following hyperparameters are used for training: Adam optimiser with a learning rate of 0.001 and weight decay of 0.0005, batch size of 256, early stopping with patience of 100 epochs, and Sinkhorn temperature set to 0.01.
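For concreteness, the reported settings map onto a standard PyTorch training setup as sketched below; the placeholder model and the sinkhorn routine are assumptions for illustration, not the authors' implementation.

# Sketch of the reported hyperparameters in a standard PyTorch setup
# (model and sinkhorn are placeholders, not GRAPHEDX itself).
import torch

model = torch.nn.Linear(16, 16)  # placeholder for the GRAPHEDX model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=5e-4)
BATCH_SIZE = 256
PATIENCE = 100        # early-stopping patience, in epochs
SINKHORN_TEMP = 0.01  # temperature used inside the Sinkhorn normalisation

def sinkhorn(log_scores, temp=SINKHORN_TEMP, n_iters=20):
    """Log-domain Sinkhorn: returns an (approximately) doubly stochastic matrix."""
    log_alpha = log_scores / temp
    for _ in range(n_iters):
        log_alpha = log_alpha - torch.logsumexp(log_alpha, dim=-1, keepdim=True)
        log_alpha = log_alpha - torch.logsumexp(log_alpha, dim=-2, keepdim=True)
    return log_alpha.exp()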