Two Influence Maximization Games on Graphs Made Temporal
Authors: Niclas Boehmer, Vincent Froese, Julia Henkel, Yvonne Lasars, Rolf Niedermeier, Malte Renken
IJCAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | As a first step, we investigate the existence of Nash equilibria in competitive diffusion and Voronoi games on different temporal graph classes. Even when restricting our studies to temporal paths and cycles, this turns out to be a challenging undertaking, revealing significant differences between the two games in the temporal setting. Notably, both games are equivalent on static paths and cycles. Our two main technical results are (algorithmic) proofs for the existence of Nash equilibria in temporal competitive diffusion and temporal Voronoi games when the edges are restricted not to disappear over time. |
| Researcher Affiliation | Academia | Technische Universität Berlin, Algorithmics and Computational Complexity {niclas.boehmer, vincent.froese}@tu-berlin.de, {henkel, y.lasars}@campus.tu-berlin.de, {rolf.niedermeier, m.renken}@tu-berlin.de |
| Pseudocode | No | No explicit pseudocode or algorithm blocks found. |
| Open Source Code | No | Due to lack of space, we defer several proofs (marked ) to a full version available at arxiv.org/abs/2105.05987. |
| Open Datasets | No | The paper describes theoretical work and does not use or reference any datasets for training. |
| Dataset Splits | No | The paper describes theoretical work and does not use or reference any datasets, thus no validation splits are mentioned. |
| Hardware Specification | No | The paper describes theoretical work and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper describes theoretical work and does not mention specific software dependencies with version numbers. |
| Experiment Setup | No | The paper describes theoretical work and does not detail any experimental setup, hyperparameters, or training configurations. |
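Although the paper itself provides no code, the competitive diffusion game it studies can be made concrete with a small simulation. The sketch below models a two-player competitive diffusion process on a temporal graph, assuming one propagation round per time step and the standard tie rule that a vertex reached by both players in the same step becomes blocked. The function name and these timing/tie-breaking conventions are illustrative assumptions, not taken from the paper.

```python
def competitive_diffusion(temporal_edges, starts):
    """Simulate a 2-player competitive diffusion game on a temporal graph.

    temporal_edges: list of edge sets, one per time step (edge = (u, v) tuple)
    starts: (s1, s2) starting vertices chosen by players 1 and 2
    Returns a dict mapping each player to the number of vertices they own.
    """
    BLOCKED = 0  # marker for a vertex claimed by both players at once
    color = {}
    for player, s in enumerate(starts, start=1):
        # if both players pick the same start, that vertex is blocked
        color[s] = BLOCKED if s in color else player

    for edges in temporal_edges:  # one diffusion round per time step
        claims = {}
        for u, v in edges:
            for a, b in ((u, v), (v, u)):
                if color.get(a, BLOCKED) != BLOCKED and b not in color:
                    claims.setdefault(b, set()).add(color[a])
        for v, players in claims.items():
            # a vertex adopts a color only if exactly one player reaches it
            color[v] = players.pop() if len(players) == 1 else BLOCKED

    return {p: sum(1 for c in color.values() if c == p) for p in (1, 2)}
```

On a static path 0-1-2-3-4 (modeled temporally by repeating the full edge set each step), starting from the two endpoints yields two vertices per player, with the middle vertex blocked; the temporal setting is obtained by varying which edges appear in each step.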