Probabilistic Circuits for Variational Inference in Discrete Graphical Models
Authors: Andy Shih, Stefano Ermon
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We experimentally validate the use of selective-SPNs in variational inference of discrete graphical models. We use Algorithm 1 to construct selective-SPNs, compute the exact ELBO gradient, and optimize the parameters of the selective-SPN (the edges under sum nodes) with gradient descent. ... We present results on computing lower bounds of the log partition function of three sets of graphical models: Ising models, Latent Dirichlet Allocation, and factor graphs from the UAI Inference Competition. (A minimal sketch of this exact-ELBO optimization loop appears after the table.) |
| Researcher Affiliation | Academia | Andy Shih, Computer Science Department, Stanford University, andyshih@cs.stanford.edu; Stefano Ermon, Computer Science Department, Stanford University, ermon@cs.stanford.edu |
| Pseudocode | Yes | Algorithm 1: Constructing a Selective-SPN |
| Open Source Code | Yes | Code can be found at https://github.com/AndyShih12/SPN_Variational_Inference. |
| Open Datasets | Yes | We study Latent Dirichlet Allocation [1]. Lastly, we consider factor graphs from the 2014 UAI Inference Competition. |
| Dataset Splits | No | The paper mentions evaluating models on specific graphical models and datasets (Ising models, Latent Dirichlet Allocation, UAI Inference Competition factor graphs) but does not provide specific training, validation, or test dataset splits (e.g., percentages, sample counts, or explicit splitting methodology). |
| Hardware Specification | No | The paper states: 'Experiments were run on a single GPU.' This does not provide specific hardware details such as the GPU model, CPU type, or memory, which are necessary for reproducibility. |
| Software Dependencies | No | The paper mentions using 'automatic differentiation tools' and 'libDAI [28]', but it does not provide specific version numbers for any software components or libraries required to replicate the experiments. |
| Experiment Setup | No | The paper mentions optimizing parameters with gradient descent and describes 'k' as an adjustable hyperparameter denoting the size budget, but it does not specify concrete values for learning rates, batch sizes, number of epochs, or the values of 'k' used in the experiments. It does mention 'multiple random restarts', '5 intermediate distributions' for AIS, and a '30 minutes' runtime limit, but these details are not sufficient to fully specify the experimental setup. |
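
For context on the Research Type row, the optimization the paper describes maximizes the evidence lower bound (ELBO) on the log partition function, $\log Z \ge \mathbb{E}_{q_\theta}[\log w(x)] + H(q_\theta)$, where $w$ is the product of the model's factors and $q_\theta$ is the variational distribution. The sketch below is illustrative only, not the authors' implementation: it swaps the paper's selective-SPN for a fully factorized (mean-field) $q_\theta$ and uses brute-force state enumeration in place of circuit evaluation, which is tractable only at toy scale. All names and hyperparameters here (`log_weight`, `theta`, the learning rate) are hypothetical.

```python
# Minimal sketch (not the authors' code): exact ELBO gradient ascent on a
# tiny Ising model, with a mean-field q standing in for the selective-SPN
# and brute-force enumeration standing in for circuit evaluation.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                                           # number of spins (tiny, so enumeration works)
J = np.triu(rng.normal(scale=0.5, size=(n, n)), 1)   # pairwise couplings (upper triangle)
h = rng.normal(scale=0.1, size=n)                    # unary fields

def log_weight(x):
    """Unnormalized log-potential log w(x) of the Ising model, x in {-1,+1}^n."""
    return x @ J @ x + h @ x

# Enumerate all 2^n states (toy-scale stand-in for circuit evaluation).
states = np.array(list(itertools.product([-1.0, 1.0], repeat=n)))
log_w = np.array([log_weight(x) for x in states])
log_Z = np.logaddexp.reduce(log_w)              # exact log partition function

theta = np.zeros(n)                             # logits of q(x_i = +1)
lr, steps = 0.5, 500                            # hypothetical hyperparameters
for _ in range(steps):
    p = 1.0 / (1.0 + np.exp(-theta))            # marginals q(x_i = +1)
    q = np.prod(np.where(states > 0, p, 1.0 - p), axis=1)  # q(x) for each state
    elbo = q @ log_w - q @ np.log(q)            # E_q[log w] + H(q) <= log Z
    # Exact ELBO gradient: sum_x dq/dtheta * (log w(x) - log q(x) - 1),
    # with dq/dtheta = q * dlog q/dtheta evaluated coordinate-wise.
    score = np.where(states > 0, 1.0 - p, -p)   # dlog q(x)/dtheta_i
    grad = score.T @ (q * (log_w - np.log(q) - 1.0))
    theta += lr * grad

print(f"ELBO = {elbo:.4f}  <=  log Z = {log_Z:.4f}")
```

On a model this small the exact $\log Z$ is available by enumeration, so the printed gap shows how tight the bound is; the paper's contribution is making this same exact-ELBO gradient computation tractable for much larger models via probabilistic circuits rather than enumeration.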