The Complexity of Bayesian Network Learning: Revisiting the Superstructure
Authors: Robert Ganian, Viktoriia Korchemna
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We investigate the parameterized complexity of Bayesian Network Structure Learning (BNSL)... We proceed by showing that this result can be strengthened to a localized version of the feedback edge set, and provide corresponding lower bounds that complement previous results to provide a complexity classification of BNSL w.r.t. virtually all well-studied graph parameters. We then analyze how the complexity of BNSL depends on the representation of the input. In particular, while the bulk of past theoretical work on the topic assumed the use of the so-called non-zero representation, here we prove that if an additive representation can be used instead then BNSL becomes fixed-parameter tractable even under significantly milder restrictions to the superstructure, notably when parameterized by the treewidth alone. |
| Researcher Affiliation | Academia | Robert Ganian and Viktoriia Korchemna Algorithms and Complexity Group, TU Wien {rganian,vkorchemna}@ac.tuwien.ac.at |
| Pseudocode | No | The paper describes algorithmic steps and reduction rules (e.g., 'Reduction Rule 1', 'Reduction Rule 2') in prose, but does not present them in a structured pseudocode or algorithm block format. |
| Open Source Code | No | The paper does not contain any statements about releasing open-source code or links to code repositories for the described methodology. |
| Open Datasets | No | This is a theoretical paper focused on complexity analysis and algorithmic design, and thus does not use or refer to any datasets. |
| Dataset Splits | No | This is a theoretical paper and does not involve empirical experiments with data; therefore, no training, validation, or test splits are discussed. |
| Hardware Specification | No | This is a theoretical paper and does not describe any experiments that would require hardware specifications. |
| Software Dependencies | No | The paper focuses on theoretical complexity analysis and algorithm design. It does not name any specific software packages or versions required for replication. |
| Experiment Setup | No | This is a theoretical paper focusing on complexity and algorithm design. It does not describe an experimental setup, hyperparameters, or training configurations. |
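
For readers unfamiliar with the structural parameters named in the abstract, the sketch below is a hypothetical illustration (the paper itself releases no code): it uses the `networkx` library to compute the feedback edge number of a small undirected superstructure exactly, and to upper-bound its treewidth with a standard heuristic. The example graph is made up for illustration.

```python
# Minimal sketch (not from the paper) of two superstructure parameters
# discussed in the abstract: feedback edge number and treewidth.
# Assumes the networkx library; the example graph is hypothetical.
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

# A small undirected superstructure: a 4-cycle with a pendant path.
G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1), (4, 5), (5, 6)])

# Feedback edge number: the minimum number of edges whose removal
# leaves a forest. For an undirected graph it equals
# |E| - |V| + (number of connected components).
fes = (G.number_of_edges() - G.number_of_nodes()
       + nx.number_connected_components(G))
print("feedback edge number:", fes)  # 1 (a single cycle)

# Treewidth is NP-hard to compute exactly; the min-degree heuristic
# returns an upper bound together with a witnessing tree decomposition.
width, decomposition = treewidth_min_degree(G)
print("treewidth upper bound:", width)  # 2 for this graph
```

In the paper's terms, these are the kinds of superstructure restrictions under which the fixed-parameter results apply; per the abstract, the additive-representation algorithm needs only bounded treewidth, while the non-zero-representation results are tied to (localized) feedback edge set bounds.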