A Hybrid Approach for Probabilistic Inference using Random Projections

Authors: Michael Zhu, Stefano Ermon

ICML 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We show empirically that by using random projections, we can improve the accuracy of common approximate inference algorithms." (Sections 4. Experimental Results; 4.1. Experimental Methodology; 4.2. Ising grid models; 4.3. Real world data)
Researcher Affiliation | Academia | Stanford University, Stanford, CA 94305 USA
Pseudocode | Yes | Algorithm 1: RP-Inf Alg(G, A, M, m, f, p). (A hedged sketch of this meta-algorithm appears after the table.)
Open Source Code | No | The paper uses the libDAI library (Mooij, 2010) but does not provide access to source code for the methodology it describes.
Open Datasets | Yes | "We report our results for two selected datasets, Linkage (genetic linkage) and Promedus (medical diagnosis), in Table 1. These two datasets were chosen because they were the most challenging ones for Gibbs sampling in the UAI dataset." Gogate, Vibhav. UAI 2014 Inference Competition. http://www.hlt.utdallas.edu/~vgogate/uai14-competition/index.html, 2014.
Dataset Splits | No | The paper describes the datasets used and the number of iterations for each algorithm, but does not specify explicit training, validation, or test splits.
Hardware Specification | Yes | "For reference, these running times were obtained from running libDAI on a cluster of servers with Intel Xeon E5520 and E5620 processors."
Software Dependencies | Yes | "The probabilistic inference algorithms we will test in this section in conjunction with random projections are Mean Field, Belief Propagation, and Gibbs sampling, as implemented in the libDAI library (Mooij, 2010)."
Experiment Setup | Yes | "We choose p = 1/2 for our XOR factor potentials... We construct M = 50 random projections of G with XOR lengths l = 1, 2, 4 and number of XOR constraints m = 20. We use 10 random initializations for Mean Field on each randomly projected G(k)... We construct M = 1,000 random projections of G with XOR length l = 4 and number of XOR constraints m = 20. We run up to 10,000 iterations of Gibbs sampling and Belief Propagation on each G(k)..." (These settings are restated in the configuration sketch after the table.)
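The paper's Algorithm 1 takes a graphical model G, a base inference algorithm A, a number of random projections M, a number of XOR constraints m, a factor potential f, and a parameter p. The following is a minimal Python sketch of that meta-algorithm's shape, assuming each projection adds m soft parity (XOR) factors over l randomly chosen variables and that marginal estimates are aggregated by averaging. The factor parameterization (weight 1 for a satisfied parity, weight p otherwise), the brute-force stand-in for libDAI's inference routines, and all function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sample_xor_factor(n_vars, l, p, rng):
    """Sample one soft parity (XOR) factor: l distinct variables, a random
    target bit b, and a potential with weight 1 when the parity of the
    chosen variables equals b and weight p otherwise.  (Assumed
    parameterization of the paper's f and p arguments.)"""
    scope = rng.choice(n_vars, size=l, replace=False)
    b = int(rng.integers(0, 2))
    return lambda x: 1.0 if int(x[scope].sum()) % 2 == b else p

def brute_force_marginals(G):
    """Exact marginals P(x_i = 1) by enumeration: a toy stand-in for the
    libDAI Mean Field / BP / Gibbs routines used in the paper."""
    n = G["n_vars"]
    probs, Z = np.zeros(n), 0.0
    for idx in range(2 ** n):
        x = np.array([(idx >> i) & 1 for i in range(n)])
        w = 1.0
        for phi in G["factors"] + G.get("extra_factors", []):
            w *= phi(x)
        Z += w
        probs += w * x
    return probs / Z

def rp_inference(G, A, M, m, l, p, rng):
    """Sketch of the RP-Inf Alg(G, A, M, m, f, p) loop: run the base
    algorithm A on M randomly projected copies of G, then aggregate the
    marginal estimates (here by averaging, one plausible choice)."""
    estimates = []
    for _ in range(M):
        xors = [sample_xor_factor(G["n_vars"], l, p, rng) for _ in range(m)]
        G_k = dict(G, extra_factors=xors)  # G augmented with m XOR factors
        estimates.append(A(G_k))
    return np.mean(estimates, axis=0)

# Toy usage: a 4-variable Ising chain with attractive pairwise couplings.
rng = np.random.default_rng(0)
pair = lambda i, j, w=0.5: (lambda x: np.exp(w * (2 * x[i] - 1) * (2 * x[j] - 1)))
G = {"n_vars": 4, "factors": [pair(i, i + 1) for i in range(3)]}
print(brute_force_marginals(G))                                # exact marginals
print(rp_inference(G, brute_force_marginals, M=50, m=2, l=2, p=0.5, rng=rng))
```

In the paper, A would be one of libDAI's Mean Field, Belief Propagation, or Gibbs sampling implementations rather than the brute-force enumerator used here for self-containedness.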
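For concreteness, the two experimental configurations quoted in the Experiment Setup row can be written out as follows. The parameter values are the ones reported in the paper; the dictionary keys are illustrative, not taken from any released code.

```python
# Ising grid experiments: Mean Field on each projected model G(k).
ISING_GRID_SETUP = {
    "algorithm": "Mean Field",
    "num_projections": 50,      # M
    "xor_lengths": [1, 2, 4],   # l values swept
    "num_xors": 20,             # m
    "xor_potential_p": 0.5,     # p
    "random_inits": 10,         # restarts per projected model G(k)
}

# Real-world UAI datasets (Linkage, Promedus).
REAL_WORLD_SETUP = {
    "algorithms": ["Gibbs sampling", "Belief Propagation"],
    "num_projections": 1000,    # M
    "xor_lengths": [4],         # l
    "num_xors": 20,             # m
    "xor_potential_p": 0.5,     # p
    "max_iterations": 10_000,   # per-run iteration cap
}
```

In terms of the sketch above, each configuration corresponds to calling rp_inference once per XOR length with M, m, l, and p taken from the dictionary and the base algorithm A supplied by libDAI.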