The Parameterized Complexity of Cascading Portfolio Scheduling

Authors: Eduard Eiben, Robert Ganian, Iyad Kanj, Stefan Szeider

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper we study the parameterized complexity of this problem and establish its fixed-parameter tractability by utilizing structural properties of the success relation. Our findings are significant as they reveal that, in spite of the intractability of the problem in its general form, one can indeed exploit sparseness or density of the success relation to obtain non-trivial runtime guarantees for finding an optimal cascading schedule.
Researcher Affiliation | Academia | Eduard Eiben, Royal Holloway, University of London, Department of CS, UK; Robert Ganian, TU Wien, Algorithms and Complexity Group, Austria; Iyad Kanj, DePaul University, School of Computing, Chicago, USA; Stefan Szeider, TU Wien, Algorithms and Complexity Group, Austria
Pseudocode | No | The paper describes algorithms verbally (e.g., "Consider an algorithm which loops over...") and provides proof sketches, but does not include structured pseudocode blocks or explicitly labeled algorithms.
Open Source Code | No | The paper does not provide links to open-source code or state that code for its methods is available.
Open Datasets | No | The paper defines a conceptual training set T of instances as part of the problem setup, but it does not refer to a specific, named, or publicly accessible dataset used for empirical training.
Dataset Splits | No | The paper is theoretical, focusing on complexity analysis and algorithm design rather than empirical evaluation, and therefore does not describe training, validation, or test splits.
Hardware Specification | No | As a theoretical paper, it does not mention specific hardware used for experiments.
Software Dependencies | No | As a theoretical paper, it does not mention specific software dependencies with version numbers.
Experiment Setup | No | The paper does not describe any empirical experiments, so no setup details such as hyperparameters or training configurations are provided.
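For context on the problem the paper studies, the following Python sketch illustrates what "finding an optimal cascading schedule" means: given per-algorithm costs and a success relation over a training set, it scores a cascade (each instance pays for every algorithm tried until one succeeds) and finds an optimum by brute force over all orderings of all subsets. This is an illustrative assumption, not the authors' algorithm; the names `schedule_cost` and `best_cascade` are hypothetical, and the paper's contribution is precisely to beat this exponential search when the success relation is sparse or dense.

```python
import math
from itertools import permutations

def schedule_cost(schedule, instances, success, cost):
    """Total cost of running the cascade `schedule` on every training instance.

    Each instance pays the cost of every algorithm tried until one succeeds;
    a schedule that leaves some instance unsolved is treated as infeasible.
    """
    total = 0
    for inst in instances:
        solved = False
        for alg in schedule:
            total += cost[alg]
            if success[alg][inst]:
                solved = True
                break
        if not solved:
            return math.inf  # infeasible: the cascade must solve every instance
    return total

def best_cascade(algs, instances, success, cost):
    """Brute force over all orderings of all non-empty subsets of algorithms."""
    best = None
    for r in range(1, len(algs) + 1):
        for perm in permutations(algs, r):
            c = schedule_cost(perm, instances, success, cost)
            if best is None or c < best[0]:
                best = (c, perm)
    return best
```

For example, with a cheap algorithm `a` (cost 1) that solves only instance 0 and an expensive algorithm `b` (cost 3) that solves both instances, the optimal cascade is to try `a` first and fall back to `b`, at total training cost 5.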