Lifted Message Passing for Hybrid Probabilistic Inference

Authors: Yuqiao Chen, Nicholas Ruozzi, Sriraam Natarajan

IJCAI 2019

Reproducibility Variable Result LLM Response
Research Type Experimental We demonstrate empirically that our approximate lifting schemes perform comparably to existing state-of-the-art models for Gaussian MLNs, while having the flexibility to be applied to models with arbitrary potential functions.
Researcher Affiliation Academia Yuqiao Chen, Nicholas Ruozzi and Sriraam Natarajan, University of Texas at Dallas {yuqiao.chen, nicholas.ruozzi, sriraam.natarajan}@utdallas.edu
Pseudocode Yes Algorithm 1 Lifted Hybrid EPBP
Open Source Code Yes EPBP, LEPBP, C2FEPBP, and LGaBP were implemented in Python 3.6, and all source code is available on GitHub (github.com/leodd/Hybrid-Lifted-Belief-Propagation).
Open Datasets Yes We used groundwater level data extracted from the Republican River Compact Association model [McKusick, 2003], which is a monthly record of the measured head position of 3420 wells over 850 months.
Dataset Splits No The paper describes experiments on probabilistic inference models using evidence and observations, but it does not specify explicit training or validation dataset splits typically used in supervised learning contexts.
Hardware Specification Yes All experiments were performed on a machine with a 2.2 GHz Intel Core i7-8750H CPU and 16 GB of memory.
Software Dependencies Yes EPBP, LEPBP, C2FEPBP, and LGa BP were implemented in Python 3.6
Experiment Setup Yes All message-passing algorithms were run for 15 iterations and sampling-based methods used 20 sampling points for the integral approximations. For coarse-to-fine lifting, we use k-means clustering with k = 2 for evidence group splitting and use a dynamic splitting threshold, which was initially set to ϵ = max_{S_a ∈ S} avg(v_{S_a}) and was decreased each iteration until ϵ = 0.
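The coarse-to-fine evidence-splitting step quoted above can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the linear ϵ schedule, and the simple 1-D 2-means routine are all assumptions added for clarity. The idea it demonstrates matches the quoted setup: groups of evidence values are split with k-means (k = 2) whenever their spread exceeds a threshold ϵ, which starts at the largest within-group spread and is decreased each iteration until it reaches 0.

```python
# Hypothetical sketch of coarse-to-fine evidence-group splitting.
# Assumptions (not from the paper's code): a linear epsilon schedule,
# Lloyd-style 1-D 2-means, and "spread" = max deviation from the group mean.
import statistics

def kmeans2_split(values):
    """Split a list of scalars into two clusters with 1-D 2-means (k = 2)."""
    centers = [min(values), max(values)]
    clusters = ([], [])
    for _ in range(20):  # a fixed cap on Lloyd iterations
        clusters = ([], [])
        for v in values:
            # Assign each value to its nearer center (True indexes cluster 1).
            clusters[abs(v - centers[0]) > abs(v - centers[1])].append(v)
        new_centers = [statistics.mean(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        if new_centers == centers:  # assignments have stabilized
            break
        centers = new_centers
    return [c for c in clusters if c]  # drop an empty cluster, if any

def spread(group):
    """Max deviation of a group's values from their mean."""
    m = statistics.mean(group)
    return max(abs(v - m) for v in group)

def coarse_to_fine_split(groups, n_iters=4):
    """Refine evidence groups over n_iters rounds with a shrinking threshold."""
    eps0 = max(spread(g) for g in groups)  # epsilon starts at the largest spread
    for it in range(n_iters):
        eps = eps0 * (1 - it / (n_iters - 1))  # decrease linearly to 0
        refined = []
        for g in groups:
            if len(g) > 1 and spread(g) > eps:
                refined.extend(kmeans2_split(g))  # split the too-coarse group
            else:
                refined.append(g)  # group is tight enough; keep it whole
        groups = refined
    return groups
```

By the final round ϵ = 0, so every non-identical group has been split down to singletons, i.e. the model is fully ground; earlier rounds keep coarser (lifted) groups, which is what makes the scheme coarse-to-fine.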