
Expectation Propagation for approximate Bayesian inference

Thomas P. Minka. In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI 2001), pages 362--369. Morgan Kaufmann, San Francisco, 2001.

Abstract

This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
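The mechanism the abstract describes, retaining only expectations such as the mean and variance of each belief state and iterating moment-matching updates until they agree across the network, can be made concrete. The sketch below is an illustrative Python implementation of EP for the "clutter problem," a Gaussian mixture model of the kind used in the paper's experiments; the function names (ep_clutter, gauss_pdf), the fixed noise settings, and the synthetic data are assumptions for illustration, not the paper's own code.

```python
import numpy as np

def gauss_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def ep_clutter(y, prior_var=100.0, w=0.5, a=10.0, n_sweeps=20):
    """EP for the mean theta of a Gaussian observed in clutter:
       y_i ~ (1 - w) N(theta, 1) + w N(0, a),  theta ~ N(0, prior_var)."""
    n = len(y)
    # Site (approximate factor) natural parameters: precision and precision*mean.
    site_prec = np.zeros(n)
    site_pm = np.zeros(n)
    # Global Gaussian approximation q(theta), initialized to the prior.
    post_prec = 1.0 / prior_var
    post_pm = 0.0
    for _ in range(n_sweeps):
        for i in range(n):
            # 1. Cavity: remove site i from the current approximation.
            cav_prec = post_prec - site_prec[i]
            cav_pm = post_pm - site_pm[i]
            if cav_prec <= 0:          # skip ill-defined cavities
                continue
            cav_v = 1.0 / cav_prec
            cav_m = cav_pm * cav_v
            # 2. Moment-match cavity * exact mixture likelihood term.
            z_sig = (1 - w) * gauss_pdf(y[i], cav_m, cav_v + 1.0)
            z_clu = w * gauss_pdf(y[i], 0.0, a)
            r = z_sig / (z_sig + z_clu)          # signal responsibility
            d = (y[i] - cav_m) / (cav_v + 1.0)
            new_m = cav_m + r * cav_v * d
            new_v = (cav_v - r * cav_v ** 2 / (cav_v + 1.0)
                     + r * (1 - r) * (cav_v * d) ** 2)
            # 3. Store the ratio (new posterior / cavity) back as the site.
            new_prec, new_pm = 1.0 / new_v, new_m / new_v
            site_prec[i] = new_prec - cav_prec
            site_pm[i] = new_pm - cav_pm
            post_prec, post_pm = new_prec, new_pm
    return post_pm / post_prec, 1.0 / post_prec   # posterior mean, variance

# Synthetic data: true mean 2.0, half the points replaced by clutter.
rng = np.random.default_rng(0)
y = np.where(rng.random(50) < 0.5,
             rng.normal(0.0, np.sqrt(10.0), 50),
             rng.normal(2.0, 1.0, 50))
print(ep_clutter(y))
```

Each pass removes one approximate factor to form a cavity distribution, matches the first two moments of the cavity times the exact mixture likelihood, and stores the ratio back as the updated site; repeating these sweeps until the mean and variance stop changing is the "consistency throughout the network" the abstract refers to.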
