Stein Variational Gradient Descent as Moment Matching
Q. Liu and D. Wang (2018). arXiv:1810.11693. Comment: Conference on Neural Information Processing Systems (NIPS) 2018.
Abstract
Stein variational gradient descent (SVGD) is a non-parametric inference
algorithm that evolves a set of particles to fit a given distribution of
interest. We analyze the non-asymptotic properties of SVGD, showing that there
exists a set of functions, which we call the Stein matching set, whose
expectations are exactly estimated by any set of particles that satisfies the
fixed point equation of SVGD. This set is the image of Stein operator applied
on the feature maps of the positive definite kernel used in SVGD. Our results
provide a theoretical framework for analyzing the properties of SVGD with
different kernels, shedding insight into optimal kernel choice. In particular,
we show that SVGD with linear kernels yields exact estimation of means and
variances on Gaussian distributions, while random Fourier features enable
probabilistic bounds for distributional approximation. Our results offer a
refreshing view of the classical inference problem as fitting Stein's identity
or solving the Stein equation, which may motivate more efficient algorithms.
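The fixed-point view described in the abstract can be illustrated with a minimal sketch of the underlying SVGD update (Liu & Wang's original algorithm, which this paper analyzes): each particle moves along a direction combining a kernel-weighted gradient of the log-density with a repulsive kernel-gradient term. This is a toy illustration, not code from the paper; the RBF kernel, bandwidth, and Gaussian target are illustrative choices.

```python
import numpy as np

def svgd_step(particles, grad_log_p, h=1.0):
    """One SVGD direction with an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]

    The first term pulls particles toward high-density regions of p; the
    second (kernel-gradient) term pushes particles apart, preventing collapse.
    """
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))     # (n, n) kernel matrix
    attraction = K @ grad_log_p(particles)                  # (n, d)
    repulsion = np.einsum('ij,ijd->id', K, diffs) / h**2    # sum_j grad_{x_j} k(x_j, x_i)
    return (attraction + repulsion) / n

# Toy target: 1-D Gaussian N(mu, sigma^2), so grad log p(x) = -(x - mu) / sigma^2.
mu, sigma = 3.0, 1.0
grad_log_p = lambda x: -(x - mu) / sigma**2

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 1))                 # 50 particles, initialized away from the target
for _ in range(1000):
    x = x + 0.05 * svgd_step(x, grad_log_p)  # fixed step size; a schedule is common in practice
```

At the fixed point of this iteration, the particles exactly estimate expectations of the functions in the paper's "Stein matching set" determined by the kernel; e.g., the particle mean converges to mu for this Gaussian target.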
@article{liu2018stein,
abstract = {Stein variational gradient descent (SVGD) is a non-parametric inference
algorithm that evolves a set of particles to fit a given distribution of
interest. We analyze the non-asymptotic properties of SVGD, showing that there
exists a set of functions, which we call the Stein matching set, whose
expectations are exactly estimated by any set of particles that satisfies the
fixed point equation of SVGD. This set is the image of Stein operator applied
on the feature maps of the positive definite kernel used in SVGD. Our results
provide a theoretical framework for analyzing the properties of SVGD with
different kernels, shedding insight into optimal kernel choice. In particular,
we show that SVGD with linear kernels yields exact estimation of means and
variances on Gaussian distributions, while random Fourier features enable
probabilistic bounds for distributional approximation. Our results offer a
refreshing view of the classical inference problem as fitting Stein's identity
or solving the Stein equation, which may motivate more efficient algorithms.},
added-at = {2020-03-09T18:25:11.000+0100},
author = {Liu, Qiang and Wang, Dilin},
biburl = {https://www.bibsonomy.org/bibtex/2ba5247d2e1ac455573289eb91317957c/kirk86},
description = {[1810.11693] Stein Variational Gradient Descent as Moment Matching},
interhash = {4837a13a830aca7cd818821c8cf6b7ea},
intrahash = {ba5247d2e1ac455573289eb91317957c},
keywords = {bayesian optimization readings variational},
  note = {arXiv:1810.11693. Comment: Conference on Neural Information Processing Systems (NIPS) 2018},
timestamp = {2020-03-09T18:25:11.000+0100},
title = {Stein Variational Gradient Descent as Moment Matching},
url = {http://arxiv.org/abs/1810.11693},
year = 2018
}