Distributed Parameter Estimation in Probabilistic Graphical Models

Authors: Yariv D. Mizrahi, Misha Denil, Nando de Freitas

NeurIPS 2014 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper presents foundational theoretical results on distributed parameter estimation for undirected probabilistic graphical models. From the paper: "In this paper we make several theoretical contributions to the design of algorithms for distributed parameter estimation in MRFs by showing how the recent works of Liu and Ihler [13] and of Mizrahi et al. [19] can both be seen as special cases of distributed composite likelihood. Casting these two works in a common framework allows us to transfer results between them, strengthening the results of both works." (A toy sketch of composite-likelihood estimation appears after this table.)
Researcher Affiliation | Collaboration | University of British Columbia, Canada; University of Oxford, United Kingdom; Canadian Institute for Advanced Research; Google DeepMind
Pseudocode | No | The paper presents theoretical results, definitions, and theorems but does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any concrete access information for open-source code implementing the described methodology.
Open Datasets | No | This is a theoretical paper and does not mention using or providing access to a public dataset for training.
Dataset Splits | No | This is a theoretical paper and does not discuss train/validation/test dataset splits.
Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments.
Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations.
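
For concreteness, here is a minimal sketch of the composite-likelihood idea referenced in the Research Type row. A composite likelihood replaces the full likelihood with a product of tractable conditionals, L_CL(theta) = prod_i p(x_{A_i} | x_{B_i}; theta), and each factor can be fit on a different machine. The Python toy below assumes a 4-node Ising chain with made-up coupling values; the model size, parameter values, and function names are all illustrative assumptions, not taken from the paper, and it implements plain node-wise pseudolikelihood (the simplest composite-likelihood instance) rather than the paper's exact LAP construction.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy model: a 4-node Ising chain with one coupling per edge
# (values below are made up for illustration; nothing here is from the paper).
rng = np.random.default_rng(0)
n_nodes, n_samples = 4, 4000
true_theta = np.array([0.8, -0.5, 0.3])  # edge e joins nodes e and e+1

def gibbs_sample(theta, n, burn=200):
    """Draw approximate samples from the chain by single-site Gibbs updates."""
    x, out = rng.choice([-1, 1], size=n_nodes), []
    for t in range(burn + n):
        for i in range(n_nodes):
            field = (theta[i - 1] * x[i - 1] if i > 0 else 0.0) \
                  + (theta[i] * x[i + 1] if i < n_nodes - 1 else 0.0)
            x[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1
        if t >= burn:
            out.append(x.copy())
    return np.array(out)

X = gibbs_sample(true_theta, n_samples)

def local_fit(i):
    """One composite-likelihood component: node i maximizes its conditional
    log-likelihood log p(x_i | x_neighbors) over its incident edge weights."""
    edges = [e for e in (i - 1, i) if 0 <= e < n_nodes - 1]
    def neg_cond_ll(w):
        field = np.zeros(n_samples)
        for k, e in enumerate(edges):
            j = e + 1 if e == i else e          # neighbor of i across edge e
            field += w[k] * X[:, j]
        # -log sigmoid(2 * x_i * field), summed over all samples
        return np.logaddexp(0.0, -2.0 * X[:, i] * field).sum()
    res = minimize(neg_cond_ll, np.zeros(len(edges)))
    return dict(zip(edges, res.x))

# Each node solves a small independent problem; in a distributed setting these
# fits would run on separate machines with no communication until the end.
fits = [local_fit(i) for i in range(n_nodes)]

# Consensus step: every edge is estimated twice (once per endpoint); average.
est = [np.mean([f[e] for f in fits if e in f]) for e in range(n_nodes - 1)]
print("true:", true_theta, "estimated:", np.round(est, 2))
```

With enough samples the averaged per-edge estimates approach the true couplings. Broadly, the paper's theoretical contribution concerns when such locally fit estimators remain globally consistent, which is what lets the results of Liu and Ihler [13] and Mizrahi et al. [19] be unified as special cases of distributed composite likelihood.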