Learning Graphons via Structured Gromov-Wasserstein Barycenters
Authors: Hongteng Xu, Dixin Luo, Lawrence Carin, Hongyuan Zha (pp. 10505-10513)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our proposed approach overcomes drawbacks of prior state-of-the-art methods, and outperforms them on both synthetic and real-world data. The code is available at https://github.com/HongtengXu/SGWB-Graphon. To demonstrate the efficacy of our GWB method and its smoothed variant (SGWB), we compare them with existing methods on learning synthetic graphons. For real-world graph datasets, our mixed GWB method (Mix GWBs) provides a new way to cluster graphs. |
| Researcher Affiliation | Academia | 1 Gaoling School of Artificial Intelligence, Renmin University of China; 2 Beijing Key Laboratory of Big Data Management and Analysis Methods; 3 School of Computer Science and Technology, Beijing Institute of Technology; 4 Department of Electrical and Computer Engineering, Duke University; 5 School of Data Science, Shenzhen Research Institute of Big Data, The Chinese University of Hong Kong, Shenzhen |
| Pseudocode | Yes | Algorithm 1 Learning Graphons via GWB (an illustrative GW-barycenter sketch appears after this table) |
| Open Source Code | Yes | The code is available at https://github.com/HongtengXu/SGWB-Graphon. |
| Open Datasets | Yes | The datasets are the IMDB-BINARY and the IMDB-MULTI (Yanardag and Vishwanathan 2015), which can be downloaded from (Morris et al. 2020). (A loading sketch appears after this table.) |
| Dataset Splits | Yes | In particular, for each dataset, we apply 10-fold cross-validation to evaluate each clustering method. (A splitting sketch appears after this table.) |
| Hardware Specification | No | The paper does not explicitly describe any specific hardware used to run its experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software components or libraries used. |
| Experiment Setup | Yes | We set the hyperparameters of our methods as follows: the weight of the proximal term β = 0.005, the number of iterations L = 5, and the number of Sinkhorn iterations S = 10; for the SGWB method, the weight of the smoothness regularizer α = 0.0002. (These values are collected in a configuration sketch after this table.) |
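The "Pseudocode" row refers to the paper's Algorithm 1, which estimates a graphon as a Gromov-Wasserstein (GW) barycenter of the observed adjacency matrices and reads the barycenter off as a step function. The snippet below is a minimal illustrative sketch of that idea; it substitutes POT's generic entropic GW barycenter solver for the authors' proximal-point solver, and the helper name `estimate_graphon`, the uniform node distributions, and the `epsilon` value are assumptions rather than details taken from the paper.

```python
# A minimal sketch of graphon estimation as a Gromov-Wasserstein (GW) barycenter.
# NOTE: this is NOT the authors' Algorithm 1; it swaps in POT's generic entropic
# GW barycenter solver, and `estimate_graphon`, the uniform node distributions,
# and `epsilon` are illustrative assumptions.
import numpy as np
import ot  # POT: Python Optimal Transport


def estimate_graphon(adjacency_matrices, resolution=100, epsilon=0.05):
    """Estimate a step-function graphon from a set of adjacency matrices."""
    Cs = [np.asarray(A, dtype=float) for A in adjacency_matrices]
    ps = [ot.unif(C.shape[0]) for C in Cs]   # uniform node distribution per graph
    p = ot.unif(resolution)                  # uniform distribution for the barycenter
    lambdas = [1.0 / len(Cs)] * len(Cs)      # equal weight for each observed graph

    # The barycenter structure matrix (resolution x resolution) is read off as the
    # values of a step-function graphon on [0, 1]^2.
    W = ot.gromov.entropic_gromov_barycenters(
        resolution, Cs, ps=ps, p=p, lambdas=lambdas,
        loss_fun="square_loss", epsilon=epsilon, max_iter=100)
    return np.clip(W, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy input: a few symmetric Erdos-Renyi adjacency matrices of different sizes.
    graphs = []
    for n in (40, 60, 80):
        A = np.triu((rng.random((n, n)) < 0.3).astype(float), k=1)
        graphs.append(A + A.T)
    print(estimate_graphon(graphs, resolution=50).shape)  # (50, 50)
```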
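The IMDB-BINARY and IMDB-MULTI graph classification datasets cited in the "Open Datasets" row come from the TUDataset collection (Morris et al. 2020). The paper does not say which loader was used; one common option is PyTorch Geometric's `TUDataset` class, sketched below as an assumption.

```python
# One common way to fetch the TUDataset graphs (Morris et al. 2020); the paper
# does not specify a loader, so using PyTorch Geometric here is an assumption.
from torch_geometric.datasets import TUDataset
from torch_geometric.utils import to_dense_adj

for name in ("IMDB-BINARY", "IMDB-MULTI"):
    dataset = TUDataset(root="data/TUDataset", name=name)
    # Convert each graph to a dense adjacency matrix for graphon estimation.
    adjacencies = [
        to_dense_adj(g.edge_index, max_num_nodes=g.num_nodes)[0].numpy()
        for g in dataset
    ]
    print(name, len(adjacencies), "graphs")
```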
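The "Dataset Splits" row reports 10-fold cross-validation for evaluating each clustering method. A generic way to produce such folds, assuming scikit-learn (which the paper does not name) and a hypothetical `evaluate_clustering` placeholder for the actual metric, is:

```python
# Generic 10-fold split over a list of graphs; scikit-learn is an assumed choice,
# and `evaluate_clustering` is a hypothetical placeholder for the evaluation metric.
import numpy as np
from sklearn.model_selection import KFold


def ten_fold_scores(graphs, evaluate_clustering):
    """Return the mean and standard deviation of a metric over 10 folds."""
    scores = []
    kf = KFold(n_splits=10, shuffle=True, random_state=0)
    for train_idx, test_idx in kf.split(graphs):
        train = [graphs[i] for i in train_idx]
        test = [graphs[i] for i in test_idx]
        scores.append(evaluate_clustering(train, test))
    return float(np.mean(scores)), float(np.std(scores))
```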
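For reference, the hyperparameters quoted in the "Experiment Setup" row can be collected into a single configuration; only the values come from the paper, while the key names below are illustrative assumptions.

```python
# Hyperparameter values quoted from the paper; the key names are assumptions.
HYPERPARAMS = {
    "proximal_weight_beta": 0.005,   # weight of the proximal term (beta)
    "num_iterations": 5,             # L, number of iterations
    "num_sinkhorn_iterations": 10,   # S, Sinkhorn iterations
    "smoothness_alpha": 2e-4,        # SGWB only: weight of the smoothness regularizer
}
```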