Proportional Decisions in Perpetual Voting

Authors: Martin Lackner, Jan Maly

AAAI 2023

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | First, we define two classes of perpetual voting rules that are particularly easy to explain to voters and we explore the bounds imposed by this simplicity. Second, we study proportionality in the perpetual setting and identify two rules with strong proportionality guarantees. However, both rules yield different guarantees and we prove them to be incompatible with each other. We prove that Perpetual Consensus satisfies the upper quota axiom and Perpetual Phragmén the lower quota axiom. In addition, both rules have bounded dry spells. Finally, we show that Perpetual Phragmén satisfies perpetual priceability, an axiom based on work in the multi-winner setting by Peters and Skowron (2020). |
| Researcher Affiliation | Academia | 1 TU Wien, Vienna, Austria; 2 ILLC, University of Amsterdam, Netherlands. lackner@dbai.tuwien.ac.at, j.f.maly@uva.nl |
| Pseudocode | No | The paper describes procedures for rules such as Perpetual Phragmén in paragraph form, but provides no structured pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | No statement regarding the release of open-source code for the methodology described in this paper was found. |
| Open Datasets | No | The paper is theoretical and does not use or reference any datasets for training or evaluation. Examples are illustrative and not based on real data. |
| Dataset Splits | No | The paper is theoretical and does not involve experiments with dataset splits, so no validation split information is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe any computational experiments, so no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not discuss implementation details or software dependencies. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setups, hyperparameters, or training configurations. |
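Since the Pseudocode row notes that rules like Perpetual Phragmén are described only in prose, the following is an illustrative sketch of the load-balancing idea behind sequential Phragmén adapted to repeated rounds. This is our own reconstruction under standard assumptions from the Phragmén literature, not the authors' exact formulation; all function and variable names are hypothetical.

```python
def perpetual_phragmen(rounds):
    """Hedged sketch of a perpetual Phragmén-style rule.

    rounds: a list, one entry per decision round; each entry maps an
    alternative to the set of voters approving it in that round.
    Voters carry a "load" across rounds. In each round, the chosen
    alternative's unit cost is shared by its supporters so that their
    loads equalize, and we pick the alternative minimizing that
    resulting load.
    """
    loads = {}   # voter -> accumulated load, carried over between rounds
    outcome = []
    for approvals in rounds:
        best, best_load = None, float("inf")
        for alt, supporters in approvals.items():
            if not supporters:
                continue
            # New shared load if `alt` wins: supporters split a cost of 1
            # on top of the loads they already carry.
            shared = (1 + sum(loads.get(v, 0.0) for v in supporters)) / len(supporters)
            if shared < best_load:
                best, best_load = alt, shared
        for v in approvals[best]:
            loads[v] = best_load   # supporters' loads equalize at the new level
        outcome.append(best)
    return outcome
```

For example, with three voters always approving `a` and one voter always approving `b`, the lone `b`-supporter's comparatively low load eventually lets `b` win a round, which matches the proportionality intuition (a 1/4 minority obtaining roughly 1 in 4 decisions) that the lower quota axiom formalizes.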