Finding an ϵ-Close Minimal Variation of Parameters in Bayesian Networks
Authors: Bahare Salmani, Joost-Pieter Katoen
IJCAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Our experiments show that ϵ-close tuning of large BN benchmarks with up to eight parameters is feasible." "Our experiments on our prototypical implementation indicate that ϵ-bounded tuning of up to 8 parameters for large networks with 100 variables is feasible." |
| Researcher Affiliation | Academia | Bahare Salmani and Joost-Pieter Katoen RWTH Aachen University {salmani, katoen}@cs.rwth-aachen.de |
| Pseudocode | Yes | "Algorithm 1: Minimal change tuning" and "Algorithm 2: R+-minimal distance instantiation" |
| Open Source Code | Yes | https://github.com/baharslmn/pbn-epsilon-tuning |
| Open Datasets | Yes | We parametrized benchmarks from the bnlearn repository and defined different constraints. |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, citations to predefined splits, or detailed splitting methodology). |
| Hardware Specification | Yes | We conducted all our experiments on a 2.3 GHz Intel Core i5 processor with 16 GB RAM. |
| Software Dependencies | Yes | We empirically evaluated our approach using a prototypical realization on top of the probabilistic model checker Storm [Hensel et al., 2022] (version 1.7.0). |
| Experiment Setup | Yes | The hyperparameters of the algorithm are the coverage factor 0 < η < 1, the region expansion factor 0 < γ < 1, and the maximum number of iterations K ∈ ℕ. We took γ=1/2 and K=6 for our experiments, see Sec. 5.4. |