Validated Variational Inference via Practical Posterior Error Bounds
J. Huggins, M. Kasprzak, T. Campbell, and T. Broderick. (2019). arXiv:1910.04102. Comment: A python package for carrying out our validated variational inference workflow -- including doing black-box variational inference and computing the bounds we develop in this paper -- is available at https://github.com/jhuggins/viabel. The same repository also contains code for reproducing all of our experiments.
Abstract
Variational inference has become an increasingly attractive fast alternative
to Markov chain Monte Carlo methods for approximate Bayesian inference.
However, a major obstacle to the widespread use of variational methods is the
lack of post-hoc accuracy measures that are both theoretically justified and
computationally efficient. In this paper, we provide rigorous bounds on the
error of posterior mean and uncertainty estimates that arise from
full-distribution approximations, as in variational inference. Our bounds are
widely applicable, as they require only that the approximating and exact
posteriors have polynomial moments. Our bounds are also computationally
efficient for variational inference because they require only standard values
from variational objectives, straightforward analytic calculations, and simple
Monte Carlo estimates. We show that our analysis naturally leads to a new and
improved workflow for validated variational inference. Finally, we demonstrate
the utility of our proposed workflow and error bounds on a robust regression
problem and on a real-data example with a widely used multilevel hierarchical
model.
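The abstract notes that the bounds require only standard values from variational objectives and simple Monte Carlo estimates. As a minimal illustration of that kind of Monte Carlo computation (this is not the paper's actual bounds; the Gaussian target, sample size, and all names below are illustrative assumptions), one can estimate the KL divergence between a variational approximation q and a tractable target p from draws of q, and check it against the closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational approximation q = N(mu_q, s_q^2); target "posterior" p = N(0, 1).
mu_q, s_q = 0.5, 0.8

def log_q(x):
    # Log density of q at x.
    return -0.5 * np.log(2 * np.pi * s_q**2) - (x - mu_q) ** 2 / (2 * s_q**2)

def log_p(x):
    # Log density of the standard normal target at x.
    return -0.5 * np.log(2 * np.pi) - x**2 / 2

# Simple Monte Carlo estimate of KL(q || p) using draws from q:
# KL(q || p) = E_q[log q(X) - log p(X)].
x = rng.normal(mu_q, s_q, size=200_000)
kl_mc = np.mean(log_q(x) - log_p(x))

# Closed form for two Gaussians, as a sanity check on the estimate.
kl_exact = np.log(1.0 / s_q) + (s_q**2 + mu_q**2 - 1.0) / 2.0

print(f"MC estimate: {kl_mc:.4f}, exact: {kl_exact:.4f}")
```

The same pattern — averaging a log-density ratio over samples from the tractable approximation — is what makes divergence-based quantities cheap to estimate once a variational fit is in hand.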
Description
[1910.04102] Validated Variational Inference via Practical Posterior Error Bounds
@article{huggins2019validated,
abstract = {Variational inference has become an increasingly attractive fast alternative
to Markov chain Monte Carlo methods for approximate Bayesian inference.
However, a major obstacle to the widespread use of variational methods is the
lack of post-hoc accuracy measures that are both theoretically justified and
computationally efficient. In this paper, we provide rigorous bounds on the
error of posterior mean and uncertainty estimates that arise from
full-distribution approximations, as in variational inference. Our bounds are
widely applicable, as they require only that the approximating and exact
posteriors have polynomial moments. Our bounds are also computationally
efficient for variational inference because they require only standard values
from variational objectives, straightforward analytic calculations, and simple
Monte Carlo estimates. We show that our analysis naturally leads to a new and
improved workflow for validated variational inference. Finally, we demonstrate
the utility of our proposed workflow and error bounds on a robust regression
problem and on a real-data example with a widely used multilevel hierarchical
model.},
added-at = {2021-01-12T01:12:16.000+0100},
author = {Huggins, Jonathan H. and Kasprzak, Mikołaj and Campbell, Trevor and Broderick, Tamara},
biburl = {https://www.bibsonomy.org/bibtex/2d9eb52e28415ba3e79df871184596cbc/kirk86},
description = {[1910.04102] Validated Variational Inference via Practical Posterior Error Bounds},
interhash = {25e73e57afdddd9359f90b2273714f7d},
intrahash = {d9eb52e28415ba3e79df871184596cbc},
keywords = {bayesian bounds inference readings variational},
  note = {arXiv:1910.04102. Comment: A python package for carrying out our validated variational inference workflow -- including doing black-box variational inference and computing the bounds we develop in this paper -- is available at https://github.com/jhuggins/viabel. The same repository also contains code for reproducing all of our experiments},
timestamp = {2021-01-12T01:12:16.000+0100},
title = {Validated Variational Inference via Practical Posterior Error Bounds},
url = {http://arxiv.org/abs/1910.04102},
year = 2019
}