Tractable Operations for Arithmetic Circuits of Probabilistic Models

Authors: Yujia Shen, Arthur Choi, Adnan Darwiche

NeurIPS 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We empirically illustrate the advantages of PSDDs compared to other AC representations, for compiling probabilistic graphical models.
Researcher Affiliation | Academia | Yujia Shen, Arthur Choi, and Adnan Darwiche; Computer Science Department, University of California, Los Angeles, CA 90095; {yujias,aychoi,darwiche}@cs.ucla.edu
Pseudocode | Yes | Algorithm 1 Multiply(n1, n2, v) (a hedged sketch of this multiply-with-cache idea appears after the table).
Open Source Code | No | The paper mentions third-party tools (the 'minic2d' package and the 'ace' system) and links to them, but it does not state that the authors' own implementation of the described methodology is publicly available.
Open Datasets | Yes | The benchmarks in Table 1 are from the UAI-14 Inference Competition (http://www.hlt.utdallas.edu/~vgogate/uai14-competition/index.html).
Dataset Splits | No | The paper compiles Markov networks and reports the size of the resulting circuits and the compilation time, but it does not describe training, validation, or test splits; the focus is on compilation, not model training.
Hardware Specification | No | The paper does not specify the hardware (e.g., CPU or GPU model, memory) used to run its experiments; it only reports compilation times.
Software Dependencies | No | The paper mentions using the 'minic2d package' and the 'ace system', but it does not provide specific version numbers for these or any other software dependencies.
Experiment Setup | No | The paper describes the algorithmic steps for compiling probabilistic graphical models (e.g., constructing a vtree, compiling factors into PSDDs, and multiplying the PSDDs), but it does not provide experiment-setup details such as hyperparameters or system-level training settings of the kind typically reported in machine learning papers.
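
To make the pipeline referenced in the rows above more concrete, here is a minimal sketch of the compile-then-multiply idea: compile each factor of a Markov network into a small circuit, then multiply the circuits pairwise with a cache keyed on node pairs, echoing the shape of the paper's Multiply(n1, n2, v) recursion. Everything here is an illustrative assumption rather than the authors' implementation: the Node class, compile_factor, multiply, partition_function, the fixed variable order, and the toy factors are invented for this sketch, and the structure is an ordered decision-diagram toy rather than a PSDD (no vtree, no parameter normalization).

```python
# Hedged sketch only: a toy ordered decision diagram, not the paper's PSDD
# data structure or its Algorithm 1. All names below are illustrative.

class Node:
    """Leaf nodes carry a nonnegative weight; internal nodes branch on a
    binary variable (child `lo` for value 0, `hi` for value 1)."""
    __slots__ = ("var", "lo", "hi", "weight")
    def __init__(self, var=None, lo=None, hi=None, weight=None):
        self.var, self.lo, self.hi, self.weight = var, lo, hi, weight

def leaf(w):
    return Node(weight=w)

def compile_factor(scope, table, order):
    """Compile one factor (table maps assignments over `scope` to weights)
    into a circuit that branches on its variables in the global `order`."""
    in_scope = [v for v in order if v in scope]
    def build(i, assignment):
        if i == len(in_scope):
            return leaf(table[tuple(assignment[u] for u in scope)])
        v = in_scope[i]
        return Node(var=v,
                    lo=build(i + 1, {**assignment, v: 0}),
                    hi=build(i + 1, {**assignment, v: 1}))
    return build(0, {})

def multiply(n1, n2, rank, cache):
    """Recursive product of two circuits with a cache keyed on node-id
    pairs, echoing the shape of Multiply(n1, n2, v)."""
    key = (id(n1), id(n2))
    if key in cache:
        return cache[key]
    if n1.weight is not None and n2.weight is not None:    # leaf x leaf
        out = leaf(n1.weight * n2.weight)
    elif n1.weight is not None:                             # constant x node
        out = Node(n2.var, multiply(n1, n2.lo, rank, cache),
                   multiply(n1, n2.hi, rank, cache))
    elif n2.weight is not None:                             # node x constant
        out = Node(n1.var, multiply(n1.lo, n2, rank, cache),
                   multiply(n1.hi, n2, rank, cache))
    elif n1.var == n2.var:                                  # same variable
        out = Node(n1.var, multiply(n1.lo, n2.lo, rank, cache),
                   multiply(n1.hi, n2.hi, rank, cache))
    elif rank[n1.var] < rank[n2.var]:                       # n1 branches first
        out = Node(n1.var, multiply(n1.lo, n2, rank, cache),
                   multiply(n1.hi, n2, rank, cache))
    else:                                                   # n2 branches first
        out = Node(n2.var, multiply(n1, n2.lo, rank, cache),
                   multiply(n1, n2.hi, rank, cache))
    cache[key] = out
    return out

def partition_function(node, memo=None):
    """Sum of weights over all branches of the circuit (every network
    variable appears in some factor in this toy, so no correction is
    needed for variables that never get branched on)."""
    memo = {} if memo is None else memo
    if id(node) in memo:
        return memo[id(node)]
    z = node.weight if node.weight is not None else \
        partition_function(node.lo, memo) + partition_function(node.hi, memo)
    memo[id(node)] = z
    return z

if __name__ == "__main__":
    order = [0, 1, 2]                        # a fixed global variable order
    rank = {v: i for i, v in enumerate(order)}
    # Two unnormalized pairwise factors of a small Markov network.
    f1 = compile_factor((0, 1), {(0, 0): 2.0, (0, 1): 1.0,
                                 (1, 0): 1.0, (1, 1): 3.0}, order)
    f2 = compile_factor((1, 2), {(0, 0): 1.0, (0, 1): 2.0,
                                 (1, 0): 4.0, (1, 1): 1.0}, order)
    product = multiply(f1, f2, rank, cache={})
    print("Z =", partition_function(product))   # 29.0 for this toy network
```

The cache keyed on node-id pairs is the ingredient that keeps the pairwise product polynomial in the sizes of the two input circuits, which is the tractability property the paper establishes for PSDD multiplication; the real algorithm additionally respects a shared vtree and renormalizes parameters so that the result remains a valid PSDD.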