Provable convergence guarantees for black-box variational inference
Authors: Justin Domke, Robert Gower, Guillaume Garrigos
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We obtain non-asymptotic convergence guarantees for this problem, under simple assumptions. This provides rigorous guarantees that methods similar to those used in practice converge on realistic inference problems. |
| Researcher Affiliation | Collaboration | Justin Domke, University of Massachusetts Amherst (domke@cs.umass.edu); Guillaume Garrigos, Université Paris Cité and Sorbonne Université, CNRS, Laboratoire de Probabilités, Statistique et Modélisation, F-75013 Paris, France (garrigos@lpsm.paris); Robert Gower, Center for Computational Mathematics, Flatiron Institute, New York (rgower@flatironinstitute.org) |
| Pseudocode | Yes | Algorithm 1: Prox-SGD with energy estimator and triangular factors; Algorithm 2: Proj-SGD with entropy estimator and symmetric factors; Algorithm 3: Proj-SGD with STL estimator and symmetric factors (an illustrative sketch of a Prox-SGD-style update appears after the table) |
| Open Source Code | No | The paper does not contain any explicit statements about open-sourcing code or links to a code repository. |
| Open Datasets | No | This paper is theoretical and does not describe any experiments that would use datasets. |
| Dataset Splits | No | This paper is theoretical and does not describe any experiments that would specify data splits. |
| Hardware Specification | No | This paper is theoretical and does not describe any experimental setup or the hardware used. |
| Software Dependencies | No | This paper is theoretical and does not mention specific software dependencies with version numbers. |
| Experiment Setup | No | This paper is theoretical and does not describe any experimental setup or hyperparameters. |
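
To make the pseudocode row concrete, here is a minimal illustrative sketch of a Prox-SGD step for Gaussian black-box variational inference in the spirit of the paper's Algorithm 1. This is not the authors' implementation: it assumes a dense Gaussian family q_w = N(m, CCᵀ) with lower-triangular C, a one-sample reparameterized "energy" gradient, and a closed-form proximal step for the entropy term; the function names (`log_p`, `energy_grad`, `prox_neg_logdet`) and the finite-difference gradient are hypothetical stand-ins.

```python
# Illustrative sketch of Prox-SGD for Gaussian BBVI (NOT the authors' code).
# Objective: energy term E_q[-log p(z)] plus the non-smooth entropy part
# -sum_i log C_ii, which the proximal step handles in closed form.
import numpy as np

def energy_grad(log_p, m, C, rng, eps=1e-5):
    """One-sample reparameterization estimate of the gradient of E_q[-log p(z)]."""
    d = m.shape[0]
    u = rng.standard_normal(d)
    z = m + C @ u
    # Finite-difference gradient of log p at z (stand-in for autodiff).
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d); e[i] = eps
        g[i] = (log_p(z + e) - log_p(z - e)) / (2 * eps)
    grad_m = -g                        # d/dm of -log p(m + C u)
    grad_C = -np.tril(np.outer(g, u))  # d/dC, kept lower-triangular
    return grad_m, grad_C

def prox_neg_logdet(C, gamma):
    """Prox of gamma * (-sum_i log C_ii): closed form on the diagonal."""
    C = C.copy()
    diag = np.diag(C)
    np.fill_diagonal(C, (diag + np.sqrt(diag**2 + 4.0 * gamma)) / 2.0)
    return C

def prox_sgd_bbvi(log_p, d, steps=1000, gamma=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    m, C = np.zeros(d), np.eye(d)
    for _ in range(steps):
        gm, gC = energy_grad(log_p, m, C, rng)
        m = m - gamma * gm
        C = prox_neg_logdet(C - gamma * gC, gamma)  # entropy via the prox step
    return m, C

# Hypothetical usage: fit a standard-normal target in 2 dimensions.
if __name__ == "__main__":
    m, C = prox_sgd_bbvi(lambda z: -0.5 * z @ z, d=2)
    print(m, C @ C.T)
```

The split mirrors the structure the paper's guarantees rely on: the smooth energy term is handled by a stochastic gradient step, while the non-smooth entropy term is handled exactly by the proximal operator, which keeps the triangular scale factor's diagonal strictly positive.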