Inproceedings

Lifted Conditioning for Pairwise Marginals and Beyond

Proceedings of LWA 2010 - Workshop-Woche: Lernen, Wissen & Adaptivität, Kassel, Germany (2010)

Abstract

Lifted belief propagation (LBP) can be extremely fast at computing approximate marginal probability distributions over single ground atoms and neighboring ones in the underlying graphical model. However, it does not prescribe a way to compute joint distributions over pairs, triples, or k-tuples of distant ground atoms. In this paper, we present an algorithm, called conditioned LBP, for approximating these distributions. Essentially, we select variables one at a time for conditioning and run lifted belief propagation after each selection. This naive solution, however, recomputes the lifted network from scratch in each step, often canceling the benefits of lifted inference. We show how to avoid this by efficiently computing the lifted network for each conditioning step directly from the one already known for the single-node marginals. This contribution not only advances the theoretical understanding of lifted inference but also allows one to efficiently solve many important AI tasks, such as finding the MAP assignment, sequential forward sampling, parameter estimation, active learning, and sensitivity analysis, to name only a few.
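The chain-rule conditioning scheme described in the abstract can be illustrated with a minimal Python sketch. This is not the authors' implementation: the pairwise marginal P(Xi, Xj) is assembled as P(Xi) * P(Xj | Xi = xi) by clamping Xi to each of its values and re-running marginal inference. For self-containment, brute-force enumeration over a tiny binary MRF stands in for (lifted) belief propagation, and all names (factors, marginal, pairwise_marginal) are hypothetical.

    import itertools
    import numpy as np

    # Three binary variables 0, 1, 2 with pairwise "agreement" potentials.
    factors = {(0, 1): np.array([[2.0, 1.0], [1.0, 2.0]]),
               (1, 2): np.array([[2.0, 1.0], [1.0, 2.0]])}

    def joint(assign):
        """Unnormalized joint weight of a full assignment."""
        p = 1.0
        for (a, b), table in factors.items():
            p *= table[assign[a], assign[b]]
        return p

    def marginal(i, evidence=None):
        """P(Xi | evidence) by enumeration (stand-in for lifted BP)."""
        evidence = evidence or {}
        probs = np.zeros(2)
        for assign in itertools.product([0, 1], repeat=3):
            if all(assign[v] == x for v, x in evidence.items()):
                probs[assign[i]] += joint(assign)
        return probs / probs.sum()

    def pairwise_marginal(i, j):
        """P(Xi, Xj) via conditioning: clamp Xi, rerun inference for Xj."""
        p_i = marginal(i)
        table = np.zeros((2, 2))
        for xi in (0, 1):  # one conditioning step per value of Xi
            table[xi, :] = p_i[xi] * marginal(j, evidence={i: xi})
        return table

    print(pairwise_marginal(0, 2))  # joint over the distant pair X0, X2

The paper's contribution concerns exactly the repeated inference calls in the conditioning loop: rather than recomputing the lifted network from scratch for each clamped value, it is derived directly from the lifted network already computed for the single-node marginals.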
