Inference and Learning with Model Uncertainty in Probabilistic Logic Programs
Authors: Victor Verreet, Vincent Derkinderen, Pedro Zuidberg Dos Martires, Luc De Raedt
AAAI 2022, pp. 10060-10069
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically outperform state-of-the-art methods on probabilistic inference tasks in second-order Bayesian networks, digit classification and discriminative learning in the presence of epistemic uncertainty. |
| Researcher Affiliation | Academia | ¹ Department of Computer Science, KU Leuven, Belgium; ² Leuven.AI, KU Leuven Institute for AI, Belgium; ³ Center for Applied Autonomous Systems, Örebro University, Sweden |
| Pseudocode | No | The paper describes algorithms and approaches in textual form, such as "General Inference Approach" and "A Learning Algorithm," but does not include formal pseudocode blocks or algorithms. |
| Open Source Code | Yes | Our implementation can be found on GitHub |
| Open Datasets | Yes | Our experiments involve three Bayesian networks proposed in (Kaplan and Ivanovska 2018) and five larger networks from the BNLearn repository. Each network is transformed into a ProbLog program using a script available from the ProbLog code repository (see the ProbLog sketch after the table). |
| Dataset Splits | No | The paper describes how models are constructed for experiments and mentions the number of data points (N) for generating parameters, but it does not specify explicit training, validation, and testing dataset splits (e.g., percentages or counts). |
| Hardware Specification | Yes | The time scaling experiments were run on an Intel Xeon CPU at 2.20 GHz. BetaProbLog additionally took advantage of a GeForce GTX 1080 Ti GPU. |
| Software Dependencies | No | The paper mentions "implemented with PyTorch (Paszke et al. 2019)" but does not specify version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | During learning we use the Adam optimizer, a sample count of 1000, and allow for 100 epochs (illustrated in the training sketch after the table). |
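The network-to-ProbLog conversion itself relies on a script from the ProbLog code repository that is not reproduced here. As a minimal illustration of the target format, the sketch below encodes a toy two-node Bayesian network as a ProbLog program and runs exact inference through the `problog` Python package; the network structure and probabilities are invented for illustration, not taken from the paper's benchmarks.

```python
from problog import get_evaluatable
from problog.program import PrologString

# Toy Bayesian network encoded as a ProbLog program (illustrative only;
# the paper uses networks from Kaplan and Ivanovska (2018) and BNLearn,
# converted by a script from the ProbLog code repository).
model = PrologString("""
0.6::burglary.
0.3::earthquake.
0.9::alarm :- burglary.
0.8::alarm :- earthquake.
query(alarm).
""")

# Compile the program and evaluate the query probability exactly.
result = get_evaluatable().create_from(model).evaluate()
for query, probability in result.items():
    print(query, probability)  # alarm: 0.6504 (noisy-or of the two rules)
```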
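The reported training setup (Adam, a sample count of 1000, 100 epochs) might look roughly like the following PyTorch sketch. The Beta-distributed parameters reflect the paper's epistemic-uncertainty setting, but the objective, parameter shapes, and softplus link are placeholder assumptions, not the authors' code; no learning rate is quoted, so Adam's default is used.

```python
import torch

SAMPLE_COUNT = 1000  # "a sample count of 1000"
NUM_EPOCHS = 100     # "allow for 100 epochs"

# Unconstrained parameters mapped to positive Beta concentrations via softplus
# (hypothetical parameterization; five parameters chosen arbitrarily).
alpha = torch.nn.Parameter(torch.ones(5))
beta = torch.nn.Parameter(torch.ones(5))
optimizer = torch.optim.Adam([alpha, beta])  # default learning rate; none is reported

for epoch in range(NUM_EPOCHS):
    optimizer.zero_grad()
    dist = torch.distributions.Beta(
        torch.nn.functional.softplus(alpha),
        torch.nn.functional.softplus(beta),
    )
    # Monte Carlo estimate of an expected loss over SAMPLE_COUNT parameter
    # samples; rsample keeps the samples differentiable w.r.t. alpha and beta.
    samples = dist.rsample((SAMPLE_COUNT,))   # shape: (1000, 5)
    loss = ((samples - 0.7) ** 2).mean()      # placeholder objective
    loss.backward()
    optimizer.step()
```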