Group Activity Selection on Social Networks
Authors: Ayumi Igarashi, Dominik Peters, Edith Elkind
AAAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We show that if multiple groups can simultaneously engage in the same activity, finding a stable outcome is easy as long as the network is acyclic. In contrast, if each activity can be assigned to a single group only, finding stable outcomes becomes intractable, even if the underlying network is very simple: the problem of determining whether a given instance of a GASP admits a Nash stable outcome turns out to be NP-hard when the social network is a path or a star, or if the size of each connected component is bounded by a constant. On the other hand, we obtain fixed-parameter tractability results for this problem with respect to the number of activities. |
| Researcher Affiliation | Academia | Ayumi Igarashi, Dominik Peters, Edith Elkind Department of Computer Science University of Oxford, UK {ayumi.igarashi, dominik.peters, edith.elkind}@cs.ox.ac.uk |
| Pseudocode | No | The paper describes algorithms in prose within the proofs of theorems (e.g., Theorem 5), but does not present them in a distinct pseudocode or algorithm block format. |
| Open Source Code | No | The paper mentions an extended version available on arXiv, but does not provide any link or statement regarding the release of source code. |
| Open Datasets | No | The paper is theoretical and does not use or mention any datasets for training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not use or mention data splits. |
| Hardware Specification | No | The paper is theoretical and does not mention hardware specifications for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention software dependencies with specific version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe experimental setup details like hyperparameters or training configurations. |
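
As the table records, the paper is purely theoretical and releases no code. Purely to illustrate the Nash-stability notion quoted in the Research Type row, the following is a minimal Python sketch, not taken from the paper: the instance encoding, the utility representation as dictionaries over (activity, group size) pairs plus a "void" option, and function names such as `is_nash_stable` are assumptions made here for illustration. It brute-force checks whether a given assignment in a small network GASP instance is feasible (each group induces a connected subgraph) and Nash stable (no agent gains by deviating to the void activity or by joining an adjacent group).

```python
def _connected(members, edges):
    """True if `members` induces a connected subgraph of the network
    (the empty set and singletons count as connected)."""
    members = set(members)
    if len(members) <= 1:
        return True
    start = next(iter(members))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for v in members - seen:
            if (u, v) in edges or (v, u) in edges:
                seen.add(v)
                stack.append(v)
    return seen == members


def is_feasible(assignment, edges, activities):
    """Every group of agents assigned to the same activity must be connected."""
    for a in activities:
        group = [i for i, x in assignment.items() if x == a]
        if not _connected(group, edges):
            return False
    return True


def is_nash_stable(assignment, utilities, edges, activities):
    """No agent can strictly gain by switching to the void activity or by
    joining another activity's group; in the network model, joining a
    non-empty group requires adjacency to at least one current member.
    Unlisted (activity, size) pairs are treated as unacceptable."""
    groups = {a: {i for i, x in assignment.items() if x == a} for a in activities}

    def current_utility(i):
        a = assignment[i]
        if a is None:
            return utilities[i].get("void", float("-inf"))
        return utilities[i].get((a, len(groups[a])), float("-inf"))

    for i in assignment:
        u_i = current_utility(i)
        # Deviation 1: abandon the current activity and do nothing.
        if assignment[i] is not None and utilities[i].get("void", float("-inf")) > u_i:
            return False
        # Deviation 2: join some other activity b (group size grows by one).
        for b in activities:
            if b == assignment[i]:
                continue
            members = groups[b]
            adjacent = any((i, j) in edges or (j, i) in edges for j in members)
            if (not members or adjacent) and \
                    utilities[i].get((b, len(members) + 1), float("-inf")) > u_i:
                return False
    return True


# Tiny example: a path 1 - 2 - 3, one activity "a" done by agents 1 and 2.
edges = {(1, 2), (2, 3)}
activities = ["a"]
assignment = {1: "a", 2: "a", 3: None}
utilities = {
    1: {("a", 2): 1, "void": 0},
    2: {("a", 2): 1, "void": 0},
    3: {("a", 3): -1, "void": 0},
}
print(is_feasible(assignment, edges, activities),
      is_nash_stable(assignment, utilities, edges, activities))  # -> True True
```

This sketch only checks a given outcome; the paper's hardness results concern the harder question of whether any Nash stable outcome exists, which a brute-force search would answer only for very small instances.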