Message Passing for Collective Graphical Models

Authors: Tao Sun, Dan Sheldon, Akshat Kumar

ICML 2015

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | "We evaluate NLBP with two sets of experiments. First, we evaluate the extent to which NLBP accelerates CGM inference and learning for a benchmark synthetic bird migration problem (Sheldon et al., 2013; Liu et al., 2014). Then, we demonstrate the benefits of a more scalable inference algorithm by evaluating CGMs in a new application: learning with noisy sufficient statistics."
Researcher Affiliation | Academia | University of Massachusetts Amherst; Mount Holyoke College; Singapore Management University
Pseudocode | Yes | Algorithm 1: Non-Linear Belief Propagation
Open Source Code | No | The paper does not provide an explicit statement or link for open-source code related to the methodology described.
Open Datasets | No | "Synthetic data is generated from a chain-structured CGM to simulate migration of a population of M birds from the bottom-left corner to the top-right corner of an ℓ × ℓ grid."
Dataset Splits | No | The paper generates synthetic data and simulates trajectories rather than explicitly defining training, validation, and test splits from a pre-existing dataset.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running its experiments.
Software Dependencies | No | The paper mentions using "MATLAB's interior-point algorithm" but does not specify version numbers for MATLAB or any other software dependencies.
Experiment Setup | Yes | "In the following experiments, we set M = 1000, T = 20 and vary grid size L from 5 × 5 to 19 × 19. We report results for w_true = (5, 10, 10, 10). We added Poisson noise y ∼ Pois(αn) to the nodes, with detection rate α = 1. For the CGM-based algorithms, we ran 250 EM iterations, which was enough for convergence in almost all cases."
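The experiment-setup row describes synthetic data from a chain-structured CGM: M birds move across an L × L grid for T time steps, and observed node counts are Poisson-corrupted, y ∼ Pois(αn). The sketch below illustrates that generation process under simplifying assumptions. The function name `simulate_counts` and the biased random-walk transition model are illustrative inventions; the paper's actual model uses a parameterized (log-linear) transition distribution that is not reproduced here.

```python
import numpy as np

def simulate_counts(M=1000, L=5, T=20, alpha=1.0, seed=0):
    """Hypothetical sketch of the paper's synthetic data generation.

    M birds start in the bottom-left corner of an L x L grid and drift
    toward the top-right corner over T steps. At each step we record
    noisy node counts y ~ Pois(alpha * n), where n are the true counts
    and alpha is the detection rate (alpha = 1 in the paper's setup).
    """
    rng = np.random.default_rng(seed)
    pos = np.zeros((M, 2), dtype=int)  # all birds begin at cell (0, 0)
    noisy_counts = []
    for t in range(T):
        # True node counts n_t: number of birds occupying each grid cell.
        n = np.zeros((L, L), dtype=int)
        np.add.at(n, (pos[:, 0], pos[:, 1]), 1)
        # Observed counts with Poisson noise at detection rate alpha.
        noisy_counts.append(rng.poisson(alpha * n))
        # Simplified dynamics: each bird steps right or up (clipped at
        # the boundary), standing in for the paper's transition model.
        axis = rng.integers(0, 2, size=M)  # 0 -> x move, 1 -> y move
        pos[np.arange(M), axis] = np.minimum(pos[np.arange(M), axis] + 1, L - 1)
    return noisy_counts
```

A CGM inference algorithm such as NLBP would then recover the posterior over the true count tables n_t from the noisy observations alone, without access to individual trajectories.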