On the Parameterized Complexity of Polytree Learning

Authors: Niels Grüttemeier, Christian Komusiewicz, Nils Morawietz

IJCAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work, we revisit the complexity of POLYTREE LEARNING. We show that POLYTREE LEARNING can be solved in single-exponential FPT time with respect to the number of variables. Moreover, we consider the influence of d, the number of variables that may receive a nonempty parent set in the final DAG, on the complexity of POLYTREE LEARNING. We show that POLYTREE LEARNING is presumably not fixed-parameter tractable for d, in contrast to Bayesian network learning, which is fixed-parameter tractable for d. Finally, we show that efficient algorithms can be obtained when both d and the maximum parent set size are bounded.
Researcher Affiliation | Academia | Niels Grüttemeier, Christian Komusiewicz and Nils Morawietz, Philipps-Universität Marburg, Marburg, Germany; {niegru, komusiewicz, morawietz}@informatik.uni-marburg.de
Pseudocode | Yes | Algorithm 1: FPT algorithm for parameter d + p
1: Input: (N, F, t) and dependent vertices v₁, …, v_d
2: Â₀ := {∅}
3: for i = 1 … d do
4:   Ã_i := ⋃_{P ∈ P_F(v_i) \ {∅}} Â_{i−1} ⊕ (v_i, P)
5:   Â_i := ComputeRepresentation(Ã_i, (d − i) · p)
6: return Â ∈ Â_d such that (N, Â) is a polytree and score(Â) is maximal
(A Python sketch of the search problem behind this skeleton is given after the table.)
Open Source Code | No | The paper presents theoretical algorithms and complexity results but provides no open-source code for the described methods.
Open Datasets | No | The paper is theoretical and reports no experiments on datasets.
Dataset Splits | No | The paper is theoretical and reports no experiments, so no dataset splits are described.
Hardware Specification | No | The paper is theoretical and reports no computational experiments that would require specific hardware.
Software Dependencies | No | The paper is theoretical and reports no computational experiments that would require versioned software dependencies.
Experiment Setup | No | The paper is theoretical and reports no experimental setup, hyperparameters, or system-level training settings.
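
For readers who want a concrete reference point for Algorithm 1, the following is a minimal Python sketch of the underlying search problem, not the authors' implementation. It omits the paper's key ingredient: instead of pruning the family of partial arc sets with ComputeRepresentation, it exhaustively enumerates one candidate parent set per dependent vertex, so its running time is exponential in the whole input rather than FPT in d + p. All identifiers (is_forest, best_polytree, candidates, score) are hypothetical, and a polytree is taken here to be a DAG whose underlying undirected graph is acyclic.

    from itertools import product

    def is_forest(n_vertices, arcs):
        # Union-find on vertices 0..n_vertices-1: the arc set forms a
        # polytree iff its underlying undirected graph has no cycle.
        parent = list(range(n_vertices))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        for u, v in arcs:
            ru, rv = find(u), find(v)
            if ru == rv:
                return False  # adding (u, v) closes an undirected cycle
            parent[ru] = rv
        return True

    def best_polytree(n_vertices, dependent, candidates, score):
        # candidates[v]: iterable of frozensets, the candidate parent
        # sets P_F(v); score(v, P): the local score of assigning parent
        # set P to vertex v. Enumerate one candidate parent set per
        # dependent vertex, keep only forest-shaped arc sets, and
        # return the best-scoring one.
        best_arcs, best_val = None, float("-inf")
        for choice in product(*(list(candidates[v]) for v in dependent)):
            arcs = [(u, v) for v, P in zip(dependent, choice) for u in P]
            if not is_forest(n_vertices, arcs):
                continue
            val = sum(score(v, P) for v, P in zip(dependent, choice))
            if val > best_val:
                best_arcs, best_val = arcs, val
        return best_arcs, best_val

    # Toy usage: 4 vertices, vertices 1 and 2 dependent, local score
    # equal to the parent set size.
    cands = {1: [frozenset({0})], 2: [frozenset({0}), frozenset({1, 3})]}
    print(best_polytree(4, [1, 2], cands, lambda v, P: len(P)))
    # -> ([(0, 1), (1, 2), (3, 2)], 3)

The point of the paper's ComputeRepresentation step is precisely to avoid this blow-up: after processing each dependent vertex it shrinks the family Ã_i to a representative subfamily whose size depends only on d and p, which the naive enumeration above does not do.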