GraphCroc: Cross-Correlation Autoencoder for Graph Structural Reconstruction

Authors: Shijin Duan, Ruyi Ding, Jiaxing He, Aidong Adam Ding, Yunsi Fei, Xiaolin Xu

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Both theoretical analysis and numerical evaluations demonstrate that our methodology significantly outperforms existing self-correlation-based GAEs in graph structure reconstruction."
Researcher Affiliation | Academia | "Shijin Duan, Ruyi Ding, Jiaxing He, Aidong Adam Ding, Yunsi Fei, Xiaolin Xu. Northeastern University. {duan.s, ding.ruy, he.jiaxi, a.ding, y.fei, x.xu}@northeastern.edu"
Pseudocode | No | The paper describes methods in detail but does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | "Our code is available in https://github.com/sjduan/GraphCroc."
Open Datasets | Yes | "Dataset. We assess GraphCroc in various graph tasks. Specifically, we utilize datasets for molecules, scaling from small (PROTEINS [2]) to large (Protein-Protein Interactions (PPI) [12], and QM9 [29]), for scientific collaboration (COLLAB [46]), and for movie collaboration (IMDB-Binary [46])." (loading sketch below the table)
Dataset Splits | No | The paper mentions training and testing but does not explicitly provide details for a separate validation split (e.g., specific percentages or sample counts for a validation set). (split sketch below the table)
Hardware Specification | Yes | "For example, due to the large graph size, the default setting (vector dimension of 128 and layer number of 4) in EGNN when reproducing the PPI task will cause an out-of-memory issue on the 40GB A100 GPU."
Software Dependencies | No | The paper describes the implementation details and dependencies (e.g., GNN models, optimizers) but does not provide specific version numbers for any software libraries or packages.
Experiment Setup | Yes | "Table 6: The architecture and training configuration of GraphCroc on selected graph tasks." Table 6's columns are: input dim., embedding dim., # layers, pooling rate, and training config. (opt., lr, epochs). (configuration sketch below the table)
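
All five benchmarks quoted in the Open Datasets row are standard public datasets. As a minimal sketch, assuming PyTorch Geometric as the loading library (the paper does not specify its data-loading code; root paths here are illustrative):

```python
# Minimal sketch: fetching the paper's benchmark datasets with
# PyTorch Geometric. Root paths are arbitrary; data is downloaded
# on first use. This assumes PyG tooling, not the authors' code.
from torch_geometric.datasets import TUDataset, PPI, QM9

proteins = TUDataset(root="data/tu", name="PROTEINS")     # small molecule/protein graphs
collab   = TUDataset(root="data/tu", name="COLLAB")       # scientific collaboration
imdb     = TUDataset(root="data/tu", name="IMDB-BINARY")  # movie collaboration
ppi      = PPI(root="data/ppi", split="train")            # protein-protein interactions
qm9      = QM9(root="data/qm9")                           # large molecular dataset

print(len(proteins), len(collab), len(imdb), len(ppi), len(qm9))
```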
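The Dataset Splits row notes that no validation split is reported. Purely as an illustration of what such a split would look like, here is a sketch with assumed 80/10/10 ratios and an assumed seed; neither number comes from the paper:

```python
# Hypothetical 80/10/10 train/val/test split. The paper does not
# report validation ratios, so these numbers are placeholders.
import torch
from torch.utils.data import random_split
from torch_geometric.datasets import TUDataset

dataset = TUDataset(root="data/tu", name="PROTEINS")
n = len(dataset)
n_train, n_val = int(0.8 * n), int(0.1 * n)
gen = torch.Generator().manual_seed(0)  # fixed seed so the split is reproducible
train_set, val_set, test_set = random_split(
    dataset, [n_train, n_val, n - n_train - n_val], generator=gen
)
print(len(train_set), len(val_set), len(test_set))
```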
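The Experiment Setup row points to Table 6, whose columns give per-task architecture and training settings. A configuration container mirroring those columns might look like the sketch below; every value is a hypothetical placeholder, not a number reported in the paper:

```python
# Sketch of a per-task configuration mirroring Table 6's columns.
# All values below are hypothetical placeholders -- consult Table 6
# in the paper for the actual per-task settings.
from dataclasses import dataclass

@dataclass
class GraphCrocConfig:
    input_dim: int           # node feature dimension of the dataset
    embedding_dim: int       # latent embedding width
    num_layers: int          # encoder/decoder depth
    pooling_rate: float      # graph pooling keep-ratio
    optimizer: str = "adam"  # placeholder
    lr: float = 1e-3         # placeholder
    epochs: int = 100        # placeholder

# Hypothetical instantiation, for illustration only:
cfg = GraphCrocConfig(input_dim=3, embedding_dim=128, num_layers=4, pooling_rate=0.5)
print(cfg)
```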