Stein Variational Gradient Descent as Gradient Flow

Author: Qiang Liu

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Theoretical | The paper develops the first theoretical analysis of SVGD. It establishes that the empirical measures of the SVGD samples weakly converge to the target distribution, shows that the asymptotic behavior of SVGD is characterized by a nonlinear Fokker-Planck equation (known in physics as the Vlasov equation), and develops a geometric perspective that views SVGD as a gradient flow of the KL divergence functional under a new metric structure on the space of distributions induced by the Stein operator (see the sketch after this table).
Researcher Affiliation | Academia | Qiang Liu, Department of Computer Science, Dartmouth College, Hanover, NH 03755 (qiang.liu@dartmouth.edu)
Pseudocode | Yes | Algorithm 1, "Stein Variational Gradient Descent" [1]; a runnable sketch of this update follows the table.
Open Source Code | No | The paper is theoretical and does not mention releasing source code for the described methodology.
Open Datasets | No | The paper is theoretical and conducts no experiments, so no dataset availability information is given.
Dataset Splits | No | The paper is theoretical and conducts no experiments, so no train/validation/test split information is given.
Hardware Specification | No | The paper is theoretical and describes no experiments, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and describes no experiments, so no software dependencies or version numbers are mentioned.
Experiment Setup | No | The paper is theoretical and describes no experiments, so no experimental setup details such as hyperparameters or training settings are provided.
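
For context, the objects named in the Research Type row can be written compactly. The LaTeX block below is a brief sketch assembled from the paper's abstract and the SVGD update of Liu & Wang [1]; the sign and normalization conventions shown here are assumptions, not quotations from the paper.

```latex
% SVGD (Algorithm 1 of [1]) moves each particle along x_i <- x_i + \epsilon \hat{\phi}^*(x_i),
% where \hat{\phi}^* is the kernelized Stein direction estimated from the particles:
\[
  \hat{\phi}^*(x) = \frac{1}{n} \sum_{j=1}^{n}
    \Big[ k(x_j, x)\, \nabla_{x_j} \log p(x_j) + \nabla_{x_j} k(x_j, x) \Big].
\]
% In the large-particle, small-step limit, the paper shows that the particle
% measure \mu_t evolves by a nonlinear Fokker-Planck (Vlasov) continuity equation
\[
  \partial_t \mu_t = -\nabla \cdot \big( \mu_t\, \phi^*_{\mu_t} \big),
  \qquad
  \phi^*_{\mu}(x) = \mathbb{E}_{y \sim \mu}\big[ k(y, x)\, \nabla_y \log p(y) + \nabla_y k(y, x) \big],
\]
% which it interprets as a gradient flow of KL(\mu \,\|\, p) under a metric
% structure on the space of distributions induced by the Stein operator.
```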
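
Since the table points to Algorithm 1 of [1] rather than reproducing it, here is a minimal NumPy sketch of one SVGD step, assuming the RBF kernel with the median-heuristic bandwidth from [1]. The function and variable names (`svgd_step`, `score`) and the toy Gaussian target are illustrative choices, not taken from the paper.

```python
import numpy as np

def svgd_step(X, grad_log_p, stepsize=1e-1, h=None):
    """One SVGD update (Algorithm 1 of [1]) with an RBF kernel.

    X          : (n, d) array of particles
    grad_log_p : callable mapping (n, d) -> (n, d), the score of the target p
    h          : RBF bandwidth; defaults to the median heuristic of [1]
    """
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # (n, n)
    if h is None:
        h = np.median(sq_dists) / np.log(n + 1.0)
    K = np.exp(-sq_dists / h)  # K[i, j] = k(x_i, x_j), symmetric

    # phi*(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    driving = K @ grad_log_p(X)  # kernel-weighted scores, pulls toward high density
    # For the RBF kernel, sum_j grad_{x_j} k(x_j, x_i)
    #   = (2/h) * (x_i * sum_j K[i, j] - sum_j K[i, j] x_j)
    repulsive = (2.0 / h) * (X * K.sum(axis=1, keepdims=True) - K @ X)
    return X + stepsize * (driving + repulsive) / n

# Toy usage: transport badly initialized particles toward a standard Gaussian.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(100, 2))
score = lambda X: -X  # grad log N(0, I)
for _ in range(500):
    X = svgd_step(X, score)
print(X.mean(axis=0), X.std(axis=0))  # should approach ~0 and ~1
```

The `driving` term transports particles toward high-probability regions of the target, while the `repulsive` kernel-gradient term pushes particles apart; this repulsion is what lets a finite set of deterministic particles spread out to approximate the full distribution instead of collapsing onto its mode.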