Algorithmic Bayesian Persuasion with Combinatorial Actions
Authors: Kaito Fujii, Shinsaku Sakaue (pp. 5016-5024)
AAAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We first show that constant-factor approximation is NP-hard even in some special cases of matroids or paths. We then propose a polynomial-time algorithm for general matroids by assuming the number of states of nature to be a constant. We finally consider a relaxed notion of persuasiveness, called CCE-persuasiveness, and present a sufficient condition for polynomial-time approximability. |
| Researcher Affiliation | Academia | Kaito Fujii1, Shinsaku Sakaue2 1 National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan, 2 The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan fujiik@nii.ac.jp, sakaue@mist.i.u-tokyo.ac.jp |
| Pseudocode | Yes | Algorithm 1: Algorithm for Bayesian persuasion with a general matroid constraint; Algorithm 2: Algorithm for CCE-persuasiveness |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and focuses on algorithmic design and hardness proofs; it does not use or refer to any datasets. |
| Dataset Splits | No | The paper is theoretical and does not include empirical evaluation on datasets; consequently, no dataset splits are specified. |
| Hardware Specification | No | The paper is theoretical and does not report on computational experiments, thus no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers used for implementation or experimentation. |
| Experiment Setup | No | The paper is theoretical and describes algorithms and proofs; it does not include details on experimental setup, hyperparameters, or training configurations. |
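
To make the combinatorial-action setting referenced in the Research Type and Pseudocode rows more concrete, below is a minimal, hedged sketch of a receiver's best response under a uniform-matroid (cardinality) constraint. This is an illustration of the model only, not the paper's Algorithm 1 or 2; the function name, array shapes, and numbers are all hypothetical.

```python
# Hedged illustration (not the paper's algorithms): a receiver's best response
# in Bayesian persuasion with combinatorial actions, restricted here to a
# uniform matroid (at most k actions). All names and numbers are hypothetical.

import numpy as np

def receiver_best_response(posterior, receiver_utility, k):
    """Pick up to k actions maximizing posterior-expected receiver utility.

    posterior        : array of shape (num_states,), a distribution over states
    receiver_utility : array of shape (num_states, num_actions)
    k                : rank of the uniform matroid (max number of actions)

    Maximizing a linear objective over a matroid is solved greedily; for a
    uniform matroid this reduces to taking the top-k actions with positive
    expected utility.
    """
    expected = posterior @ receiver_utility           # expected utility per action
    top_k = np.argsort(expected)[::-1][:k]            # greedy: best k actions
    return sorted(int(a) for a in top_k if expected[a] > 0)  # keep only beneficial ones

# Toy instance: 2 states of nature, 4 actions, receiver may take at most 2.
posterior = np.array([0.7, 0.3])                      # sender-induced posterior
receiver_utility = np.array([[1.0, -0.5, 0.2, 0.0],
                             [-1.0, 2.0, 0.1, 0.3]])
print(receiver_best_response(posterior, receiver_utility, k=2))  # -> [0, 1]
```

The sketch only covers the receiver's side under a posterior already induced by the sender; the paper's contribution concerns how the sender designs the signaling scheme, including the hardness results and the polynomial-time algorithm for general matroids with a constant number of states quoted above.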