Solving Marginal MAP Problems with NP Oracles and Parity Constraints

Authors: Yexiang Xue, Zhiyuan Li, Stefano Ermon, Carla P. Gomes, Bart Selman

NeurIPS 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate our algorithm on unweighted SAT instances and on weighted Markov Random Field models, comparing our algorithm with variational methods, as well as sample average approximation. We also show the effectiveness of our algorithm on applications in computer vision with deep neural networks and in computational sustainability.
Researcher Affiliation | Academia | Yexiang Xue, Department of Computer Science, Cornell University, yexiang@cs.cornell.edu; Zhiyuan Li, Institute for Interdisciplinary Information Sciences, Tsinghua University, lizhiyuan13@mails.tsinghua.edu.cn; Stefano Ermon, Department of Computer Science, Stanford University, ermon@cs.stanford.edu; Carla P. Gomes and Bart Selman, Department of Computer Science, Cornell University, {gomes,selman}@cs.cornell.edu
Pseudocode | Yes | Algorithm 1: XOR_Binary(w : A × X → {0, 1}, a0, k) ... Algorithm 2: XOR_K(w : A × X → {0, 1}, k, T) ... Algorithm 3: XOR_MMAP(w : A × X → {0, 1}, n = log2 |X|, m = log2 |A|, T)
Open Source Code | No | The paper does not provide any statement or link regarding the availability of its source code.
Open Datasets | Yes | We also apply the Marginal MAP solver to an image completion task. We first learn a two-layer deep belief network [3, 10] from a 14-by-14 MNIST dataset.
Dataset Splits | No | The paper mentions using the MNIST dataset and applying the solver to an image completion task, but it does not provide explicit details about training, validation, or test dataset splits (e.g., percentages or sample counts).
Hardware Specification | No | The paper mentions computational constraints such as "a one-hour time and a 4G memory limit", but it does not specify any particular hardware components such as GPU or CPU models.
Software Dependencies | No | The paper mentions using Mixed Integer Programming (MIP), an exact counter ACE [5], a Mixed Loopy Belief Propagation (Mixed LBP) [13] implementation, and a two-layer deep belief network [3, 10], but it does not specify version numbers for these software components or frameworks.
Experiment Setup | Yes | We tune the constants of our XOR_MMAP so it gives a 2^10 = 1024-approximation (2^−5 · sol ≤ OPT ≤ 2^5 · sol, δ = 10^−3). SAA uses 10,000 samples. For SAA, we use 1,000 samples, which is the largest we can use within the memory limit. All algorithms are given a one-hour time and a 4G memory limit.
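
The algorithms listed in the Pseudocode row all build on the same primitive: intersecting the search space with k uniformly random XOR (parity) constraints, each of which a given assignment survives with probability 2^−k. Below is a minimal illustrative sketch of that primitive, not the authors' implementation; the helper names `random_parity_constraints` and `satisfies` are hypothetical.

```python
import random

def random_parity_constraints(n, k, seed=None):
    """Draw k random XOR (parity) constraints over n binary variables.

    Each constraint is a pair (subset_mask, parity_bit): an assignment x,
    encoded as an n-bit integer, satisfies it iff the number of set bits
    in (x & subset_mask) has parity parity_bit. A fixed assignment
    satisfies all k constraints with probability 2**-k.
    """
    rng = random.Random(seed)
    return [(rng.getrandbits(n), rng.getrandbits(1)) for _ in range(k)]

def satisfies(x, constraints):
    """Check whether the n-bit assignment x survives every parity constraint."""
    return all(bin(x & mask).count("1") % 2 == bit for mask, bit in constraints)
```

In XOR_K, such constraints are conjoined with the formula w and the satisfiability query is repeated T times; if the constrained instances remain (mostly) satisfiable, roughly 2^k solutions are certified, and XOR_MMAP embeds this counting step inside the outer maximization over A.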
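
The "1024-approximation" in the Experiment Setup row follows directly from the quoted bound: a quick arithmetic check (not from the paper's code) confirms that 2^−5 · sol ≤ OPT ≤ 2^5 · sol pins OPT inside a multiplicative window of width 2^10.

```python
# The guarantee 2**-5 * sol <= OPT <= 2**5 * sol leaves OPT inside a
# multiplicative window of 2**5 / 2**-5 = 2**10 = 1024, matching the
# stated "1024-approximation".
lower_factor = 2 ** -5
upper_factor = 2 ** 5
window = upper_factor / lower_factor
assert window == 2 ** 10 == 1024
print(window)  # 1024.0
```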