Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias, so scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

The Complexity of Learning Acyclic CP-Nets

Authors: Eisa Alanazi, Malek Mouhoub, Sandra Zilles

IJCAI 2016 | Venue PDF | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This paper provides theoretical justification for exact values (or in some cases bounds) of some of the most central information complexity parameters, namely the VC dimension, the (recursive) teaching dimension, the self-directed learning complexity, and the optimal mistake bound, for classes of acyclic CP-nets. We further provide an algorithm that learns tree-structured CP-nets from membership queries. Using our results on complexity parameters, we can assess the optimality of our algorithm as well as that of another query learning algorithm for acyclic CP-nets presented in the literature. |
| Researcher Affiliation | Academia | Eisa Alanazi, Malek Mouhoub and Sandra Zilles, Department of Computer Science, University of Regina, Regina, SK, Canada, EMAIL |
| Pseudocode | Yes | Now we have a simple strategy for learning tree CP-nets with membership queries: 1. For every variable v_i, ask O(m log(m)) membership queries from the swaps over I_{i,j} to obtain orders over I_{i,j} for all j. 2. If at least two of the orders differ, i.e., there is a conflict pair, find the only parent of v_i by log(n) further queries; else Pa(v_i) = ∅. 3. CPT(v_i) is determined by the queries above. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper focuses on theoretical complexity results and algorithm design, not empirical experiments with datasets. Therefore, no public dataset information for training is provided. |
| Dataset Splits | No | The paper focuses on theoretical complexity results and algorithm design, not empirical experiments with datasets. Therefore, no dataset split information is provided. |
| Hardware Specification | No | The paper focuses on theoretical complexity results and algorithm design, not empirical experiments. Therefore, no hardware specifications are mentioned. |
| Software Dependencies | No | The paper focuses on theoretical complexity results and algorithm design, not empirical experiments. Therefore, no software dependencies with version numbers are mentioned. |
| Experiment Setup | No | The paper focuses on theoretical complexity results and algorithm design, not empirical experiments. Therefore, no specific experimental setup details like hyperparameters or training configurations are provided. |
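The membership-query strategy quoted in the Pseudocode row can be sketched in Python for the simplest case of binary variables. Everything below is an illustrative assumption, not the paper's implementation: the `(parent, cpt)` encoding of a tree CP-net, the names `swap_query`, `pref_value`, and `learn_parent`, and the requirement that CPTs be non-degenerate (a variable with a parent really changes its preference order across parent values, so a conflict pair always exists).

```python
# Hypothetical encoding (an assumption, not the paper's): a tree CP-net over
# binary variables maps each variable i to (parent, cpt), where cpt gives the
# preferred value of v_i for each parent value (key None for a root).

def preferred(cpnet, outcome, i):
    """Preferred value of v_i in the context induced by `outcome`."""
    parent, cpt = cpnet[i]
    return cpt[None if parent is None else outcome[parent]]

def swap_query(cpnet, outcome, i):
    """Membership oracle for a swap over v_i: does `outcome` hold v_i at its
    preferred value in this context?"""
    return outcome[i] == preferred(cpnet, outcome, i)

def pref_value(cpnet, context, i):
    """One membership query: the preferred value of v_i in `context`."""
    out = list(context)
    out[i] = 0
    return 0 if swap_query(cpnet, out, i) else 1

def learn_parent(cpnet, n, i):
    """Steps 1-2 of the quoted strategy, specialized to binary domains:
    detect a conflict pair, then binary-search the unique parent with
    O(log n) further queries. Assumes non-degenerate CPTs."""
    p0 = pref_value(cpnet, [0] * n, i)
    if pref_value(cpnet, [1] * n, i) == p0:
        return None                       # no conflict pair: v_i is a root
    cands = [j for j in range(n) if j != i]
    while len(cands) > 1:
        half = cands[:len(cands) // 2]
        ctx = [0] * n
        for j in half:
            ctx[j] = 1                    # flip only the first half of the candidates
        if pref_value(cpnet, ctx, i) != p0:
            cands = half                  # the parent lies in the flipped half
        else:
            cands = cands[len(cands) // 2:]
    return cands[0]

# Toy target: v0 and v3 are roots; v1 depends on v0; v2 depends on v1.
target = {0: (None, {None: 1}),
          1: (0, {0: 0, 1: 1}),
          2: (1, {0: 1, 1: 0}),
          3: (None, {None: 0})}
learned = {i: learn_parent(target, 4, i) for i in range(4)}
print(learned)  # recovers {0: None, 1: 0, 2: 1, 3: None}
```

Each call to `pref_value` costs one membership query, so finding a variable's parent takes 2 + O(log n) queries here, matching the log(n)-further-queries step of the quoted strategy; the paper's general algorithm additionally handles larger domains, where step 1 orders each CPT with O(m log(m)) queries.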