Bounding errors of Expectation-Propagation

Authors: Guillaume P. Dehaene, Simon Barthelmé

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "In this article, we prove that the approximation errors made by EP can be bounded. Our bounds have an asymptotic interpretation in the number n of datapoints, which allows us to study EP's convergence with respect to the true posterior. In particular, we show that EP converges at a rate of O(n⁻²) for the mean, up to an order of magnitude faster than the traditional Gaussian approximation at the mode. We also give similar asymptotic expansions for moments of order 2 to 4, as well as excess Kullback-Leibler cost (defined as the additional KL cost incurred by using EP rather than the ideal Gaussian approximation). All these expansions highlight the superior convergence properties of EP. Our approach for deriving those results is likely applicable to many similar approximate inference methods." (The quoted rates are restated symbolically below the table.)
Researcher Affiliation | Academia | Guillaume Dehaene, University of Geneva, guillaume.dehaene@gmail.com; Simon Barthelmé, CNRS, Gipsa-lab, simon.barthelme@gipsa-lab.fr
Pseudocode | No | The paper describes the EP algorithm but does not present it in a formally labeled pseudocode or algorithm block. (A generic sketch of Gaussian EP follows the table.)
Open Source Code | No | The paper does not provide any statements or links regarding the release of open-source code for the described methodology.
Open Datasets | No | The paper is theoretical and does not involve the use of datasets for training or evaluation.
Dataset Splits | No | The paper is theoretical and does not involve the use of datasets, and therefore does not specify training, validation, or test splits.
Hardware Specification | No | The paper is theoretical and does not describe any experimental hardware specifications.
Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations.
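
The headline claims quoted in the Research Type row can be restated symbolically. This is a paraphrase of the abstract, not a formula reproduced from the paper; the exact regularity conditions and constants are given there. Writing μ* for the true posterior mean, μ_EP for the EP mean, and μ_mode for the mean of the Gaussian approximation built at the mode (Laplace-type), the quoted rates read, in LaTeX:

\[
\lvert \mu_{\mathrm{EP}} - \mu^{*} \rvert = O(n^{-2}),
\qquad
\lvert \mu_{\mathrm{mode}} - \mu^{*} \rvert = O(n^{-1}),
\]

where n is the number of datapoints; the factor-of-n gap is the "order of magnitude" advantage the abstract refers to.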
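
Since the paper presents EP in prose only, the following is a minimal sketch of the algorithm for readers who want a concrete reference point. It is a generic textbook variant of 1-D Gaussian EP, not code from the paper: the quadrature-based moment matching, the weak initial precision, and all names (ep_1d, sites, x_grid) are our own choices.

import numpy as np

def ep_1d(sites, x_grid, n_iters=50):
    """Minimal 1-D Gaussian EP in natural parameters (precision r, shift b).

    The target is p(x) ∝ prod_i t_i(x), with each callable in `sites`
    returning t_i(x) >= 0 on the grid. Tilted moments are computed by
    simple quadrature on the (uniform) `x_grid`.
    """
    dx = x_grid[1] - x_grid[0]
    n = len(sites)
    r = np.zeros(n)           # per-site precisions
    b = np.zeros(n)           # per-site shifts (precision * mean)
    r_q, b_q = 1e-3, 0.0      # global Gaussian; tiny precision acts as a weak prior
    for _ in range(n_iters):
        for i in range(n):
            # Cavity: divide site i out of the global Gaussian.
            r_c, b_c = r_q - r[i], b_q - b[i]
            if r_c <= 0:      # improper cavity -> skip this update
                continue
            # Tilted distribution: cavity Gaussian times the exact site.
            log_t = (b_c * x_grid - 0.5 * r_c * x_grid**2
                     + np.log(sites[i](x_grid) + 1e-300))
            w = np.exp(log_t - log_t.max())
            z = w.sum() * dx
            m = (w * x_grid).sum() * dx / z                 # tilted mean
            v = (w * (x_grid - m)**2).sum() * dx / z        # tilted variance
            # Moment matching: the new global Gaussian copies (m, v).
            r_new, b_new = 1.0 / v, m / v
            # The site absorbs the difference between new global and cavity.
            r[i], b[i] = r_new - r_c, b_new - b_c
            r_q, b_q = r_new, b_new
    return b_q / r_q, 1.0 / r_q    # approximate posterior mean, variance

# Toy run: product of two Cauchy-shaped sites centred at -1 and +2.
x = np.linspace(-15.0, 15.0, 6001)
sites = [lambda t: 1.0 / (1.0 + (t + 1.0) ** 2),
         lambda t: 1.0 / (1.0 + (t - 2.0) ** 2)]
print(ep_1d(sites, x))

Quadrature keeps the sketch model-agnostic: any site that can be evaluated pointwise can be plugged in, at the cost of a grid wide enough to cover the posterior's support.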