Featured Graph Coarsening with Similarity Guarantees

Authors: Manoj Kumar, Anurag Sharma, Shashwat Saxena, Sandeep Kumar

ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments with both real and synthetic benchmark datasets elucidate the proposed framework's efficacy and applicability for numerous graph-based applications, including graph clustering, node classification, stochastic block model identification, and graph summarization. In this section, we demonstrate the effectiveness of the proposed algorithm through a comprehensive set of experiments conducted on both real and synthetic graph datasets.
Researcher Affiliation | Academia | 1 Department of Electrical Engineering, IIT Delhi, New Delhi, India; 2 Department of Mathematics and Computing, IIT Delhi, New Delhi, India; 3 Bharti School of Telecommunications Technology Management, IIT Delhi, New Delhi, India; 4 Yardi School of Artificial Intelligence, IIT Delhi, New Delhi, India.
Pseudocode | Yes | Algorithm 1: FGC Algorithm; Algorithm 2: Algorithm for node classification.
Open Source Code | No | The paper does not include an explicit statement or a link to a public repository for the source code of the described methodology.
Open Datasets | Yes | Datasets: the real and synthetic datasets, denoted dataset (p, m, n) where p is the number of nodes, m is the number of edges, and n is the number of features, used in our experiments are: (i) Cora (2708, 5278, 1433), (ii) Citeseer (3312, 4536, 3703), (iii) Polblogs (1490, 16715, 5000), (iv) ACM (3025, 13128, 1870), (v) Erdős-Rényi (ER) (1000, 25010, 5000), (vi) Watts-Strogatz (WS) (1000, 5000, 5000), (vii) Barabási-Albert (BA) (1000, 9800, 5000), (viii) Random Geometric Graph (RGG) (1000, 7265, 5000), (ix) Minnesota (2642, 3304, 5000), (x) Yeast (2361, 13292, 5000), (xi) Airfoil (4253, 12289, 5000), (xii) Bunny (2503, 78292, 5000), (xiii) Pubmed (19717, 88648, 500), and (xiv) Coauthor Physics (Co-phy) (34493, 247962, 8415). The features of Polblogs, ER, WS, BA, RGG, Yeast, Minnesota, Airfoil, and Bunny are generated as X ~ N(0, L†) (Eq. 5), where L† is the pseudo-inverse of the Laplacian matrix L of the given graph. Further details of the datasets are in Appendix B. (A code sketch of this feature-generation recipe follows the table.)
Dataset Splits | Yes | All the results are calculated using 10-fold cross-validation. (A cross-validation sketch follows the table.)
Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory amounts, or detailed computer specifications) used for running its experiments.
Software Dependencies | No | The paper mentions software such as the GCN, GAT, and APPNP architectures, but it does not specify version numbers for these or any other software dependencies (e.g., programming languages, libraries, or frameworks).
Experiment Setup | Yes | The learning rate and decay rate used in the node classification experiments are 0.01 and 0.0001, respectively. The three FGC hyperparameters used for Cora are (500, 500, 716.5); for Citeseer, (500, 500, 1851.5); for Polblogs, (500, 500, 2500); for ACM, (2000, 500, 935); etc. (details in Appendix B). (A training-setup sketch follows the table.)
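
The Open Datasets row quotes the paper's recipe of drawing node features for the feature-less graphs from a zero-mean Gaussian tied to the graph structure, X ~ N(0, L†). The snippet below is a minimal sketch of that recipe, assuming L† denotes the Moore-Penrose pseudo-inverse of the combinatorial Laplacian; the graph type, node count, and feature dimension are illustrative placeholders rather than the paper's actual settings.

```python
import numpy as np
import networkx as nx

# Illustrative sizes only; the paper uses e.g. 1000 nodes and 5000 features
# for the ER / WS / BA / RGG graphs.
rng = np.random.default_rng(0)
G = nx.erdos_renyi_graph(n=100, p=0.05, seed=0)

# Combinatorial Laplacian L = D - A and its Moore-Penrose pseudo-inverse.
L = nx.laplacian_matrix(G).toarray().astype(float)
L_pinv = np.linalg.pinv(L)

# Each feature column is one graph signal drawn from N(0, L^+), i.e. a signal
# that is smooth with respect to the graph structure.
n_features = 50
X = rng.multivariate_normal(np.zeros(L.shape[0]), L_pinv, size=n_features).T
print(X.shape)  # (number of nodes, number of features)
```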
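
The Dataset Splits row only states that results are averaged over 10-fold cross-validation; the paper's splitting code is not available. Below is a minimal sketch of such a protocol using scikit-learn's KFold, with random placeholder features and labels and a logistic-regression stand-in for whatever model is actually evaluated on the coarsened graphs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

# Placeholder features and labels; in the experiments these would be node
# features and class labels from datasets such as Cora or Citeseer.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))
y = rng.integers(0, 3, size=500)

kf = KFold(n_splits=10, shuffle=True, random_state=0)
scores = []
for train_idx, test_idx in kf.split(X):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

# Report the mean and standard deviation over the 10 folds.
print(f"accuracy: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
```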
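
The Experiment Setup row gives only a learning rate of 0.01 and a decay rate of 0.0001 for the node classification experiments, which use architectures such as GCN. The sketch below shows one plausible training configuration under the assumption that the decay rate refers to the optimizer's weight decay (it could instead mean a learning-rate schedule); the model size, dummy graph, and epoch count are illustrative and not taken from the paper.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        return self.conv2(F.relu(self.conv1(x, edge_index)), edge_index)

# Tiny dummy graph so the snippet runs end to end; the actual experiments
# train on (coarsened versions of) datasets such as Cora or Citeseer.
x = torch.randn(10, 16)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]], dtype=torch.long)
y = torch.randint(0, 3, (10,))
data = Data(x=x, edge_index=edge_index, y=y)

model = GCN(in_dim=16, hidden_dim=32, num_classes=3)
# lr = 0.01 is taken from the report; weight_decay = 0.0001 assumes the
# reported "decay rate" means Adam's weight decay.
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=1e-4)

model.train()
for epoch in range(100):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out, data.y)
    loss.backward()
    optimizer.step()
```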