Belief Update in the Horn Fragment
Authors: Nadia Creignou, Adrian Haret, Odile Papini, Stefan Woltran
IJCAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main contribution: a representation result which shows that the class of update operators captured by Horn compliant partial (resp. total) preorders over possible worlds is precisely that given by the adapted and augmented Horn update postulates. With these results at hand, we provide concrete Horn update operators and are able to shed light on Horn revision operators based on partial preorders. |
| Researcher Affiliation | Academia | Nadia Creignou (1), Adrian Haret (2), Odile Papini (1), Stefan Woltran (2). (1) Aix Marseille Université, Université de Toulon, CNRS, LIS, Marseille, France; (2) Institute of Logic and Computation 192-02, TU Wien, Vienna, Austria |
| Pseudocode | No | The paper focuses on theoretical analysis, postulates, and theorems, and does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statements about releasing open-source code for its methodology, nor does it include a link to a code repository. |
| Open Datasets | No | The paper is theoretical and focuses on logical formalisms and proofs, not empirical evaluation on datasets. Therefore, it does not mention training datasets or their availability. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with data, thus no validation splits are discussed. |
| Hardware Specification | No | The paper is theoretical and does not describe empirical experiments, so no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not describe empirical experiments, so no software dependencies with version numbers are specified. |
| Experiment Setup | No | The paper is theoretical and does not describe empirical experiments, so no experimental setup details like hyperparameters or training configurations are provided. |
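
For readers unfamiliar with the setting summarized in the Research Type row: a set of interpretations is expressible by a Horn formula exactly when it is closed under intersection of models, and model-based update operators proceed world by world over the models of the knowledge base. The sketch below is a minimal, hypothetical illustration of these two ingredients, using a Winslett-style (PMA) update as a stand-in; it is not the paper's construction, whose concrete Horn update operators are defined via Horn compliant preorders.

```python
from itertools import combinations

# Interpretations are frozensets of true atoms. A set of interpretations is
# Horn-expressible iff it is closed under intersection of models.

def closed_under_intersection(models):
    """True iff `models` (a set of frozensets) is closed under intersection."""
    models = set(models)
    return all((a & b) in models for a, b in combinations(models, 2))

def horn_closure(models):
    """Smallest intersection-closed superset of `models` (a Horn approximation)."""
    closed = set(models)
    changed = True
    while changed:
        changed = False
        for a, b in combinations(list(closed), 2):
            if (a & b) not in closed:
                closed.add(a & b)
                changed = True
    return closed

def pma_update(kb_models, mu_models):
    """Winslett-style (PMA) update: for every model w of the knowledge base,
    keep the models of the new information mu that are closest to w
    (symmetric-difference distance), and take the union over all w."""
    result = set()
    for w in kb_models:
        best = min(len(w ^ v) for v in mu_models)
        result |= {v for v in mu_models if len(w ^ v) == best}
    return result

# Tiny example over atoms {p, q}: the knowledge base has the single model {p, q};
# we update with mu = "not q", whose models are {p} and {} (the empty world).
kb = {frozenset({"p", "q"})}
mu = {frozenset({"p"}), frozenset()}
updated = pma_update(kb, mu)
print(updated)                              # {frozenset({'p'})}
print(closed_under_intersection(updated))   # True -> result is Horn-expressible
print(horn_closure(mu) == mu)               # True -> mu itself is Horn-expressible
```

If an update result is not closed under intersection, taking its Horn closure is one possible (lossy) way to stay inside the Horn fragment; whether and how to do this is exactly the kind of design choice the paper's postulates and representation result are meant to pin down.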