Fast Lifted MAP Inference via Partitioning

Authors: Somdeb Sarkhel, Parag Singla, Vibhav G. Gogate

NeurIPS 2015

Reproducibility Assessment
Research Type: Experimental
"Our experiments on several real-world datasets clearly show that our new algorithm is superior to previous approaches and often finds useful symmetries in the search space that existing lifted inference rules are unable to detect. We implemented our algorithm on top of the lifted MAP algorithm of Sarkhel et al. [18], which reduces lifted MAP inference to an integer polynomial program (IPP). We performed two sets of experiments."
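To make the quoted reduction concrete: MAP inference in a Markov logic network seeks the truth assignment that maximizes the total weight of satisfied formulas, and it is this optimization that the paper encodes as an integer polynomial program. The sketch below is a minimal brute-force illustration of that objective on an invented toy set of weighted propositional clauses; the clause weights and variables are assumptions for illustration only, and this is not the paper's lifted P-IPP algorithm.

```python
from itertools import product

# Toy weighted clauses over variables x0..x2: (weight, clause), where a
# clause is a list of literals and the literal (i, True) means "xi is true".
# These weights and formulas are invented for illustration.
weighted_clauses = [
    (2.0, [(0, True), (1, False)]),   # 2.0 : x0 v !x1
    (1.5, [(1, True), (2, True)]),    # 1.5 : x1 v x2
    (0.5, [(2, False)]),              # 0.5 : !x2
]

def map_assignment(clauses, n_vars):
    """Exhaustively search all 2^n assignments for the one maximizing
    the total weight of satisfied clauses (the MAP objective)."""
    best, best_score = None, float("-inf")
    for assignment in product([False, True], repeat=n_vars):
        score = sum(w for w, clause in clauses
                    if any(assignment[i] == sign for i, sign in clause))
        if score > best_score:
            best, best_score = assignment, score
    return best, best_score

assignment, score = map_assignment(weighted_clauses, 3)
print(assignment, score)  # -> (True, True, False) 4.0
```

Exhaustive search is exponential in the number of ground atoms, which is precisely why the paper instead compiles the problem into an integer polynomial program solved with an off-the-shelf solver.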
Researcher Affiliation: Academia
Somdeb Sarkhel, The University of Texas at Dallas; Parag Singla, I.I.T. Delhi; Vibhav Gogate, The University of Texas at Dallas.
Pseudocode: Yes
Algorithm 1: LMAP(MLN M); Algorithm 2: Constrained-Ground(MLN M, size k, domain equivalence class U); Algorithm 3: Partition-Ground(MLN M, size k, domain equivalence class U); Algorithm 4: Refine-MAP(MLN M).
Open Source Code: No
The paper mentions "Alchemy (ALY) [11]" and "Tuffy (TUFFY) [15]" as open-source software packages used for comparison, and provides a URL for Alchemy. However, it does not state that the code for the authors' own proposed algorithm (P-IPP) is open source, nor does it provide a link to a repository.
Open Datasets: Yes
"We used following five MLNs in our experimental study: (1) An MLN which we call Equivalence that consists of following three formulas...; (2) The Student MLN from [18, 19]...; (3) The Relationship MLN from [18]...; (4) Web KB MLN [11] from the Alchemy web page...; and (5) Citation Information-Extraction (IE) MLN from the Alchemy web page [11]."
Reference [11]: S. Kok, M. Sumner, M. Richardson, P. Singla, H. Poon, D. Lowd, J. Wang, and P. Domingos. The Alchemy System for Statistical Relational AI. Technical report, Department of Computer Science and Engineering, University of Washington, Seattle, WA, 2008. http://alchemy.cs.washington.edu.
Dataset Splits: No
The paper does not provide specific details on how datasets were split into training, validation, or test sets (e.g., percentages, sample counts, or explicit standard splits).
Hardware Specification: Yes
"All of our experiments were run on a third generation i7 quad-core machine having 8GB RAM."
Software Dependencies: No
The paper mentions using "Gurobi [8]" as an integer linear programming solver and cites the Gurobi Reference Manual, but the text itself does not give a specific version number for Gurobi or any other key software component.
Experiment Setup: No
The paper describes the algorithms and their modifications but does not provide specific experimental setup details such as hyperparameter values, optimization settings, or initialization procedures used during the experiments.