Challenges of Generating Structurally Diverse Graphs
Authors: Fedor Velikonivtsev, Mikhail Mironov, Liudmila Prokhorenkova
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically investigate the proposed strategies and show that it is possible to significantly improve the diversity of the generated graphs compared to basic random graph models. In addition to the numerical investigation of diversity measures, we also analyze the distribution of graph structural characteristics and the relations between them; here we also observe significantly improved diversity. Moreover, since we consider diversity measures based on several graph distances, our results shed light on the properties of these graph distances. Indeed, depending on the function we optimize, the structural properties of the generated graphs can vary, since graph distances focus on different aspects of graph dissimilarity. (A minimal computation of such a diversity measure is sketched after the table.) |
| Researcher Affiliation | Collaboration | Fedor Velikonivtsev (HSE University, Yandex Research) fvelikon@yandex-team.ru; Mikhail Mironov (Yandex Research) mironov.m.k@gmail.com; Liudmila Prokhorenkova (Yandex Research) ostroumova-la@yandex-team.ru |
| Pseudocode | No | The paper describes its algorithms (Greedy, Genetic, Local Optimization, IGGM) in prose but does not provide formal pseudocode blocks or algorithm environments labeled as such (a hedged greedy sketch appears after the table). |
| Open Source Code | Yes | The code of our experiments is publicly available at https://github.com/Abusagit/Challenges-on-generating-structurally-diverse-graphs. |
| Open Datasets | No | The paper generates graphs using various random graph models (Erdős-Rényi, Preferential Attachment, Holme-Kim, etc.) but does not use a publicly available dataset accessed via a link or citation for training purposes; all data is generated internally. |
| Dataset Splits | No | No specific training/validation/test dataset splits were mentioned, as the data is generated, not loaded from a fixed dataset with predefined splits. |
| Hardware Specification | Yes | The experiments were conducted on a machine with an Intel Core i7-7800X @ 3.50GHz CPU, 2 NVIDIA GeForce RTX 2080 Ti GPUs, and 126 GB RAM. |
| Software Dependencies | No | The paper mentions the NetworkX and igraph Python libraries and the Discrete Denoising Diffusion model DiGress (Vignac et al., 2023) but does not specify version numbers for these software components. |
| Experiment Setup | Yes | In most of the experiments, we generate N = 100 graphs with n = 16 nodes. We also conduct experiments with non-neural algorithms on a set of 100 graphs of size n = 64. For IGGM, the total number of generated graphs is limited to 1M, since training the graph generative model is time-consuming. In the tables, square brackets denote the number of computed graph representations for an algorithm or sub-algorithm. For the neural network architecture, we use the Discrete Denoising Diffusion model (DiGress) (Vignac et al., 2023). We refer to Appendix C.5 for a detailed description of our approach. We denote the number of graphs generated during each generation step by K and the total number of generated graphs by L; it is assumed (though not strictly required) that L is divisible by K. Another parameter of the algorithm is R < K: R graphs are used to train each graph generative model. In our experiments, we set R = 10^3, K = 10^5, L = 10^6. (A schematic of this generation loop appears after the table.) |
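
The diversity objective these responses refer to can be made concrete with a short sketch: the diversity of a set of graphs as the average pairwise distance between them. The paper optimizes several graph distances; the Laplacian-spectrum distance below is only a simple stand-in, not one of the paper's specific choices, and all function and variable names here are ours.

```python
# Minimal sketch of a diversity measure: average pairwise graph distance.
# The spectral distance is a hypothetical stand-in for the paper's
# distances; it assumes all graphs have the same number of nodes.
import itertools

import networkx as nx
import numpy as np

def spectral_distance(g1: nx.Graph, g2: nx.Graph) -> float:
    """L2 gap between the sorted Laplacian spectra of two equal-size graphs."""
    s1 = np.sort(nx.laplacian_spectrum(g1))
    s2 = np.sort(nx.laplacian_spectrum(g2))
    return float(np.linalg.norm(s1 - s2))

def diversity(graphs: list[nx.Graph], dist=spectral_distance) -> float:
    """Average distance over all unordered pairs of graphs."""
    pairs = list(itertools.combinations(graphs, 2))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

# Baseline matching the setup above: N = 100 graphs with n = 16 nodes.
graphs = [nx.erdos_renyi_graph(16, 0.3, seed=i) for i in range(100)]
print(diversity(graphs))
```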
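Since the paper provides no formal pseudocode for its Greedy algorithm, the following is a hedged reconstruction of one plausible variant: a farthest-point greedy that repeatedly adds the candidate graph farthest from the already-selected set. The candidate pool, the scoring rule, and all names are our assumptions; the example reuses the `spectral_distance` stand-in from the sketch above.

```python
# Hedged sketch of greedy diversity maximization (not the paper's exact
# algorithm): each step picks the graph that maximizes its distance to
# the closest already-selected graph (a farthest-point heuristic).
import networkx as nx

def greedy_select(candidates, n_select, dist):
    selected = [candidates[0]]          # seed with an arbitrary graph
    remaining = list(candidates[1:])
    while len(selected) < n_select and remaining:
        best = max(remaining, key=lambda g: min(dist(g, s) for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected

# Example: select 100 diverse graphs from a pool of 1000 random graphs,
# using the spectral_distance stand-in defined above.
pool = [nx.erdos_renyi_graph(16, p, seed=i)
        for i, p in enumerate([0.1, 0.3, 0.5, 0.7] * 250)]
diverse = greedy_select(pool, n_select=100, dist=spectral_distance)
```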
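The IGGM setup quoted in the last row (R = 10^3, K = 10^5, L = 10^6) implies an iterative loop: train a generative model on the R most diverse graphs found so far, sample K new graphs, and repeat until L graphs have been generated. The schematic below captures that loop; `train_generative_model` and `sample_graphs` are hypothetical placeholders for the DiGress training and sampling code, which this page does not reproduce.

```python
# Schematic of the IGGM loop implied by the setup (R, K, L from the paper;
# the callables are hypothetical placeholders, not a real DiGress API).
R, K, L = 10**3, 10**5, 10**6

def iggm(initial_graphs, select_diverse, train_generative_model, sample_graphs,
         n_final=100):
    generated = list(initial_graphs)
    for _ in range(L // K):                        # 10 generation steps
        train_set = select_diverse(generated, R)   # R graphs train each model
        model = train_generative_model(train_set)  # e.g., a DiGress model
        generated.extend(sample_graphs(model, K))  # K new graphs per step
    return select_diverse(generated, n_final)      # final diverse set of N graphs
```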