Diversity in Kemeny Rank Aggregation: A Parameterized Approach

Authors: Emmanuel Arrighi, Henning Fernau, Daniel Lokshtanov, Mateus de Oliveira Oliveira, Petra Wolf

IJCAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work, we investigate the impact of this combination (of diversity with fixed-parameter tractability) in the field of Kemeny Rank Aggregation, a well-studied class of problems lying in the intersection of order theory and social choice theory. In particular, we show that the Kemeny Rank Aggregation problem is fixed-parameter tractable with respect to natural parameters that formalize the notions of diversity and of a sufficiently good solution. Our main results hold both in the traditional setting of aggregation over linearly ordered votes and in the more general setting where votes are partially ordered. (A brute-force illustration of the underlying problem appears below the table.)
Researcher Affiliation | Academia | Emmanuel Arrighi (1), Henning Fernau (2), Daniel Lokshtanov (3), Mateus de Oliveira Oliveira (1), Petra Wolf (2); (1) University of Bergen, Norway; (2) University of Trier, Germany; (3) University of California Santa Barbara, CA, USA; {emmanuel.arrighi, mateus.oliveira}@uib.no, {fernau, wolfp}@informatik.uni-trier.de, daniello@ucsb.edu
Pseudocode | No | The paper describes algorithmic approaches but does not include structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code.
Open Datasets | No | The paper is theoretical and does not involve experimental evaluation on datasets.
Dataset Splits | No | The paper is theoretical and does not involve experimental evaluation requiring dataset splits.
Hardware Specification | No | The paper is theoretical and does not describe experimental hardware specifications.
Software Dependencies | No | The paper is theoretical and does not list software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training configurations.
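
For readers unfamiliar with the problem the paper studies, here is a minimal brute-force sketch of classical Kemeny rank aggregation over linearly ordered votes: find a consensus ranking minimizing the total Kendall tau distance to the input votes. This illustrates the problem statement only, not the paper's fixed-parameter algorithms or its diversity variant; the function names and the toy votes are illustrative assumptions, not artifacts from the paper.

```python
from itertools import combinations, permutations

def kendall_tau(r1, r2):
    """Number of candidate pairs ordered differently by the two rankings.

    Assumes r1 and r2 are linear orders over the same candidate set."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(
        1
        for a, b in combinations(r1, 2)
        if (pos1[a] < pos1[b]) != (pos2[a] < pos2[b])
    )

def kemeny_consensus(votes):
    """Exhaustive Kemeny aggregation: return a ranking minimizing the sum
    of Kendall tau distances to all votes. Runs in time proportional to
    m! for m candidates, so it is only usable on tiny instances."""
    candidates = votes[0]
    best, best_score = None, float("inf")
    for perm in permutations(candidates):
        score = sum(kendall_tau(perm, v) for v in votes)
        if score < best_score:
            best, best_score = perm, score
    return best, best_score

# Three linearly ordered votes over candidates a, b, c.
votes = [("a", "b", "c"), ("a", "c", "b"), ("b", "a", "c")]
print(kemeny_consensus(votes))  # (('a', 'b', 'c'), 2)
```

Exhaustive search over all m! permutations is infeasible beyond a handful of candidates; the paper's contribution is precisely to tame this hardness via parameterization, and additionally to compute diverse sets of sufficiently good consensus rankings rather than a single optimum.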