Outlier-Robust Gromov-Wasserstein for Graph Data
Authors: Lemin Kong, Jiajin Li, Jianheng Tang, Anthony Man-Cho So
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through extensive experimentation, we validate our theoretical results and demonstrate the effectiveness of RGW on real-world graph learning tasks, such as subgraph matching and partial shape correspondence. |
| Researcher Affiliation | Academia | Lemin Kong (CUHK, lkong@se.cuhk.edu.hk); Jiajin Li (Stanford University, jiajinli@stanford.edu); Jianheng Tang (HKUST, jtangbf@connect.ust.hk); Anthony Man-Cho So (CUHK, manchoso@se.cuhk.edu.hk) |
| Pseudocode | No | The paper describes algorithms (BPALM) using mathematical equations and textual explanations, but it does not include an explicitly labeled "Pseudocode" or "Algorithm" block. |
| Open Source Code | Yes | Our code is available at https://github.com/lmkong020/outlier-robust-GW. |
| Open Datasets | Yes | We evaluate the matching performance of RGW on the TOSCA dataset [6, 34] for partial shape correspondence. [...] The Proteins and Enzymes biological graph databases from [15] are also used [...] Additionally, we evaluate our methods on the Douban Online-Offline social network dataset... |
| Dataset Splits | No | The paper describes the datasets and how source graphs are generated (e.g., "sampling connected subgraphs... with a specified percentage of nodes"), but it does not specify training/validation/test splits with percentages, absolute counts, or references to predefined splits for reproducibility. |
| Hardware Specification | Yes | All simulations are conducted in Python 3.9 on a high-performance computing server running Ubuntu 20.10, equipped with an Intel(R) Xeon(R) Silver 4214R CPU. |
| Software Dependencies | No | The paper mentions "Python 3.9" and "Ubuntu 20.10" but does not list the key software libraries, frameworks, or solvers with specific version numbers (e.g., PyTorch, TensorFlow, or dedicated optimization libraries). |
| Experiment Setup | Yes | RGW sets τ1 = τ2 = 0.1 and selects the best results from the ranges {0.05, 0.1, 0.2, 0.5} for marginal relaxation parameters and {0.01, 0.05, 0.1, 0.5, 1} for step size parameters (tk, ck, rk). The transport plan initialization uses different approaches depending on the dataset. |
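The hyperparameter selection described in the Experiment Setup row can be sketched as a grid search. This is a minimal illustration, not the authors' implementation: `evaluate_rgw` is a hypothetical stand-in for running RGW matching and scoring the result, and we assume a single step-size value is shared across t_k, c_k, and r_k.

```python
from itertools import product

# Grids reported in the paper's experiment setup (see table row above).
RELAXATION_GRID = [0.05, 0.1, 0.2, 0.5]     # marginal relaxation parameters
STEP_SIZE_GRID = [0.01, 0.05, 0.1, 0.5, 1]  # step sizes (t_k, c_k, r_k)
TAU1 = TAU2 = 0.1                           # fixed as reported in the paper

def evaluate_rgw(relax, step, tau1=TAU1, tau2=TAU2):
    """Hypothetical placeholder score: replace with an actual RGW run
    (e.g., matching accuracy on a validation task)."""
    # Dummy objective, purely illustrative.
    return -abs(relax - 0.1) - abs(step - 0.1)

def best_config():
    # Enumerate the 4 x 5 = 20 grid and keep the best-scoring setting.
    return max(product(RELAXATION_GRID, STEP_SIZE_GRID),
               key=lambda cfg: evaluate_rgw(*cfg))

print(best_config())
```

In practice, "selects the best results from the ranges" implies each grid point requires a full matching run, so the sweep cost is the product of the grid sizes times the cost of one RGW solve.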