Minimization of Limit-Average Automata
Authors: Jakub Michaliszyn, Jan Otop
IJCAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main contribution is an equivalence relation on words characterizing LimAvg-automata, i.e., the equivalence classes of this relation correspond to states of an equivalent LimAvg-automaton. ... We show two applications of this relation. First, we present a minimization algorithm for LimAvg-automata, which returns a minimal LimAvg-automaton among those equivalent and structurally similar to the input one. Second, we present an extension of Angluin's L*-algorithm with syntactic queries, which learns in polynomial time a LimAvg-automaton equivalent to the target one. ... For every LimAvg-automaton T, there exists a consistent ω-automaton B, which is (a) unique, and (b) there exists a labeling wt of transitions of B with rational numbers such that the resulting LimAvg-automaton (B, wt) is equivalent to T. Such a labeling wt can be computed in polynomial time in \|T\|. ... Theorem 9. Active learning of deterministic LimAvg-automata with a teacher answering LV-value queries and LV-equivalence queries can be done in time polynomial in the size of the target automaton and the size of the teacher's responses. (An illustrative partition-refinement sketch appears after the table.) |
| Researcher Affiliation | Academia | Jakub Michaliszyn and Jan Otop, University of Wrocław, {jmi, jotop}@cs.uni.wroc.pl |
| Pseudocode | Yes | Algorithm 1 The learning algorithm. |
| Open Source Code | No | The paper does not include an unambiguous statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | The paper is theoretical and focuses on algorithm design and proofs for automaton minimization and learning. It does not describe experiments that use a dataset, public or otherwise, for training or evaluation. Figure 4 is an illustrative example, not a dataset. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments involving dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe empirical experiments, therefore it does not mention specific hardware used for computations. |
| Software Dependencies | No | The paper is theoretical and focuses on algorithmic design and proofs. It does not provide specific software names with version numbers or other dependencies needed to replicate experiments. |
| Experiment Setup | No | The paper is theoretical and describes algorithms and proofs, not empirical experiments. Therefore, it does not provide details about an experimental setup, hyperparameters, or training configurations. |
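
The Research Type row quotes the paper's core idea: an equivalence relation on words whose classes correspond to states of an equivalent LimAvg-automaton. As a rough point of reference only, the sketch below shows the classical Moore-style partition refinement for plain DFAs, i.e., the Myhill-Nerode-style idea that such relations generalize. This is not the paper's LimAvg minimization or learning algorithm; the function name `minimize_dfa` and the toy automaton are hypothetical examples introduced here for illustration.

```python
# Illustrative sketch only: Moore-style partition refinement for a deterministic
# finite automaton. States are merged when no input letter can distinguish them,
# mirroring the equivalence-class view of minimization quoted above.

def minimize_dfa(states, alphabet, delta, accepting):
    """Return the number of states of the minimal DFA equivalent to the input.

    states:    iterable of hashable state names
    alphabet:  iterable of input letters
    delta:     dict mapping (state, letter) -> state (total transition function)
    accepting: set of accepting states
    """
    states = list(states)
    # Coarsest initial partition: accepting vs. non-accepting states.
    block_of = {q: (q in accepting) for q in states}
    while True:
        # Two states stay together only if they are in the same block and,
        # for every letter, their successors are in the same block.
        signature = {
            q: (block_of[q], tuple(block_of[delta[(q, a)]] for a in alphabet))
            for q in states
        }
        blocks = {sig: i for i, sig in enumerate(sorted(set(signature.values()), key=repr))}
        new_block_of = {q: blocks[signature[q]] for q in states}
        if len(set(new_block_of.values())) == len(set(block_of.values())):
            return len(set(new_block_of.values()))
        block_of = new_block_of


if __name__ == "__main__":
    # Toy 3-state DFA over {a, b}; q1 and q2 are indistinguishable and merge.
    states = ["q0", "q1", "q2"]
    alphabet = ["a", "b"]
    delta = {
        ("q0", "a"): "q1", ("q0", "b"): "q2",
        ("q1", "a"): "q1", ("q1", "b"): "q1",
        ("q2", "a"): "q2", ("q2", "b"): "q2",
    }
    print(minimize_dfa(states, alphabet, delta, {"q1", "q2"}))  # -> 2
```

The sketch separates states only by acceptance; per the quoted theorems, the paper's relation instead has to separate words by the values a LimAvg-automaton assigns, with the weight labeling wt computable in polynomial time.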