Article

Approximate Bayesian inference for hierarchical Gaussian Markov random field models

Håvard Rue and Sara Martino.
Journal of Statistical Planning and Inference, 137 (10): 3177--3192 (October 2007)
DOI: 10.1016/j.jspi.2006.07.016

Abstract

Many commonly used models in statistics can be formulated as (Bayesian) hierarchical Gaussian Markov random field (GMRF) models. These are characterised by assuming an (often large) GMRF as the second stage in the hierarchical structure and a few hyperparameters at the third stage. Markov chain Monte Carlo (MCMC) is the common approach for Bayesian inference in such models. The Monte Carlo error of the estimates is O_p(M^{-1/2}), where M is the number of samples in the chain, so in order to obtain precise estimates of marginal densities, say, we need M to be very large. Inspired by the fact that one-block and independence samplers can often be constructed for hierarchical GMRF models, we investigate in this work whether MCMC is really needed to estimate marginal densities, which is often the goal of the analysis. By making use of GMRF approximations, we show by typical examples that marginal densities can indeed be estimated very precisely by deterministic schemes. The methodological and practical consequences of these findings are indeed positive. We conjecture that for many hierarchical GMRF models there is really no need for MCMC-based inference to estimate marginal densities. Further, by making use of numerical methods for sparse matrices, these deterministic schemes run almost instantly compared to the MCMC alternative. In particular, we discuss in detail the issue of computing marginal variances for GMRFs.
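The marginal variances discussed in the abstract are the diagonal entries of the covariance matrix Sigma = Q^{-1}, where Q is the (sparse) precision matrix of the GMRF. The following minimal sketch illustrates this on a toy first-order random-walk-type precision matrix; the model, the precision parameter `kappa`, and the small diagonal ridge are illustrative assumptions, not taken from the paper. Dense inversion is used here only for clarity; the paper's point is that sparse Cholesky factorisations and related recursions recover the same diagonal entries far more cheaply.

```python
import numpy as np

# Hypothetical toy GMRF: tridiagonal precision matrix of an
# AR(1)/random-walk-type model on n nodes (illustrative values).
n = 5
kappa = 1.0  # assumed interaction precision (not from the paper)
Q = np.zeros((n, n))
for i in range(n):
    Q[i, i] = 2.0 * kappa + 0.1  # small ridge keeps Q positive definite
    if i > 0:
        Q[i, i - 1] = Q[i - 1, i] = -kappa

# Marginal variances of the GMRF are the diagonal of Sigma = Q^{-1}.
# Shown densely for clarity; sparse methods compute only the entries
# of Q^{-1} that are actually needed.
marginal_var = np.diag(np.linalg.inv(Q))
print(marginal_var)
```

For large n one would replace the dense inverse with a sparse Cholesky factorisation and a recursion over the nonzero pattern of the factor, which is the approach the paper develops in detail.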
