Expectation Propagation for approximate Bayesian inference
T. Minka. Pages 362--369. San Francisco: Morgan Kaufmann, 2001.
Abstract
This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
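
The moment-matching loop the abstract describes can be sketched concretely: repeatedly remove one Gaussian site approximation (forming the cavity distribution), multiply in the exact factor, match the mean and variance of the resulting tilted distribution, and update the site so the expectations agree, iterating until they are consistent. Below is a minimal, self-contained Python/NumPy sketch of this loop on a one-dimensional clutter-style toy model; the data, clutter weight, prior, and grid are illustrative assumptions, not values from the paper.

import numpy as np

# Toy model (assumed for illustration): Gaussian prior on a latent mean,
# each observation contaminated by broad Gaussian clutter with weight w.
prior_mean, prior_var = 0.0, 10.0
data = np.array([0.8, 1.2, -0.5, 1.0])
w, clutter_var = 0.2, 10.0               # clutter weight / variance (assumed)

grid = np.linspace(-15.0, 15.0, 6001)    # grid for exact moment matching
dx = grid[1] - grid[0]

def integrate(values):
    # Simple Riemann sum; accurate enough on this fine grid.
    return values.sum() * dx

def factor(i, x):
    # Exact (non-Gaussian) likelihood factor t_i(x) for observation i.
    signal = np.exp(-0.5 * (data[i] - x) ** 2) / np.sqrt(2 * np.pi)
    clutter = np.exp(-0.5 * data[i] ** 2 / clutter_var) / np.sqrt(2 * np.pi * clutter_var)
    return (1.0 - w) * signal + w * clutter

# Gaussian site approximations in natural parameters:
# tau_i = 1/v_i, nu_i = m_i/v_i; start flat (zero precision).
n = len(data)
tau, nu = np.zeros(n), np.zeros(n)

for sweep in range(20):                  # iterate until expectations settle
    for i in range(n):
        tau_post = 1.0 / prior_var + tau.sum()
        nu_post = prior_mean / prior_var + nu.sum()
        tau_cav = tau_post - tau[i]      # cavity: remove site i
        nu_cav = nu_post - nu[i]
        if tau_cav <= 0:                 # skip improper cavities
            continue
        m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav
        # Tilted distribution: cavity Gaussian times the exact factor.
        tilted = np.exp(-0.5 * (grid - m_cav) ** 2 / v_cav) * factor(i, grid)
        tilted /= integrate(tilted)
        m_new = integrate(grid * tilted)                  # matched mean
        v_new = integrate((grid - m_new) ** 2 * tilted)   # matched variance
        # Update the site so that cavity * site has the tilted moments.
        tau[i] = 1.0 / v_new - tau_cav
        nu[i] = m_new / v_new - nu_cav

tau_post = 1.0 / prior_var + tau.sum()
nu_post = prior_mean / prior_var + nu.sum()
print("EP posterior: mean %.3f, var %.3f" % (nu_post / tau_post, 1.0 / tau_post))

With a multimodal grid in place of a Gaussian, the same cavity/tilted/update cycle reduces to assumed-density filtering when run in a single forward pass, which is the unification the abstract refers to.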
%0 Conference Paper
%1 minka2013expectation
%A Minka, Thomas P.
%C San Francisco
%D 2001
%E Breese, J.
%E Koller, D.
%I Morgan Kaufmann
%B Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence (UAI)
%K Bayesian_networks expectation_propagation exponential_family statistics
%P 362--369
%T Expectation Propagation for approximate Bayesian inference
%U https://arxiv.org/abs/1301.2294
%X This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
@inproceedings{minka2013expectation,
abstract = {This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.},
added-at = {2024-12-02T06:41:30.000+0100},
address = {San Francisco},
archiveprefix = {arXiv},
author = {Minka, Thomas P.},
biburl = {https://www.bibsonomy.org/bibtex/23beea97309c60356868b8f907d819ecd/peter.ralph},
editor = {Breese, J. and Koller, D.},
eprint = {1301.2294},
interhash = {5d4d5faf60b5f16924ddac69c11f2e96},
intrahash = {3beea97309c60356868b8f907d819ecd},
booktitle = {Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence (UAI)},
keywords = {Bayesian_networks expectation_propagation exponential_family statistics},
pages = {362--369},
primaryclass = {cs.AI},
publisher = {Morgan Kaufmann},
timestamp = {2024-12-02T06:50:27.000+0100},
title = {Expectation {Propagation} for approximate {Bayesian} inference},
url = {https://arxiv.org/abs/1301.2294},
year = 2001
}