A Simple and Scalable Representation for Graph Generation
Authors: Yunhui Jang, Seul Lee, Sungsoo Ahn
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct a comprehensive evaluation across ten non-attributed and two molecular graph generation tasks, demonstrating the effectiveness of GEEL. |
| Researcher Affiliation | Academia | Yunhui Jang¹, Seul Lee², Sungsoo Ahn¹ — ¹Pohang University of Science and Technology, ²Korea Advanced Institute of Science and Technology. {uni5510,sungsoo.ahn}@postech.ac.kr, seul.lee@kaist.ac.kr |
| Pseudocode | No | The paper describes algorithms and procedures in prose, but does not provide structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Reproducibility All experimental code related to this paper is available at https://github.com/yunhuijang/GEEL. |
| Open Datasets | Yes | We validate the general graph generation performance of our GEEL on eight general graph datasets with varying sizes. Four small-sized graph datasets are: (1) Planar, 200 planar graphs, (2) Lobster, 100 lobster graphs (Senger, 1997), (3) Enzymes (Schomburg et al., 2004), 587 protein tertiary structure graphs, and (4) SBM, 200 stochastic block model graphs. Four large-sized graph datasets are: (5) Ego, 757 large ego graphs from the Citeseer network dataset (Sen et al., 2008), (6) Grid, 100 2D grid graphs, (7) Proteins, 918 protein graphs, and (8) 3D point cloud, 41 3D point cloud graphs of household objects. We use two molecular datasets: QM9 (Ramakrishnan et al., 2014) and ZINC250k (Irwin et al., 2012). |
| Dataset Splits | Yes | We used the same split with GDSS (Jo et al., 2022) for the Enzymes and Grid datasets, with DiGress (Vignac et al., 2022) for the Planar and SBM datasets, with BiGG (Dai et al., 2020) for the Lobster, Proteins, and 3D point cloud datasets, and with GraphRNN (You et al., 2018) for the Ego dataset. We used the same split with GDSS (Jo et al., 2022) for a fair evaluation. |
| Hardware Specification | Yes | We used PyTorch (Paszke et al., 2019) to implement GEEL and trained the LSTM (Hochreiter & Schmidhuber, 1997) models on a GeForce RTX 3090 GPU. Note that we used an A100-40GB for the 3D point cloud dataset. In addition, due to the CUDA compatibility issue of BiGG (Dai et al., 2020), we used a GeForce GTX 1080 Ti GPU and 40 CPU cores for all models for inference time evaluation in Figure 1b and Table 2. |
| Software Dependencies | No | The paper mentions PyTorch but does not specify a version number, and no other software components are listed with version numbers. |
| Experiment Setup | Yes | Further details regarding our experimental setup are in Appendix A. In this section, we provide the details of the experiments. Table 7: Hyperparameters of GEEL in general graph generation. Table 8: Default hyperparameters of GEEL. |
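Since the paper does not pin software versions (see the Software Dependencies row), a reproducer would need to record them independently. The sketch below shows one generic way to capture an environment report; the package list passed in is an assumption for illustration, not taken from the paper or its repository.

```python
# Minimal sketch: record Python and package versions so a rerun can pin
# the same environment the original experiments used.
# The packages queried are assumptions, not specified by the paper.
import importlib.metadata
import platform


def environment_report(packages):
    """Map each package name to its installed version (or a marker)."""
    report = {"python": platform.python_version()}
    for name in packages:
        try:
            report[name] = importlib.metadata.version(name)
        except importlib.metadata.PackageNotFoundError:
            report[name] = "not installed"
    return report


if __name__ == "__main__":
    # e.g., torch for GEEL's LSTM models; numpy as a common dependency.
    print(environment_report(["torch", "numpy"]))
```

Saving this report alongside experiment outputs (e.g., as JSON) would let later runs detect version drift before comparing generation metrics.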