
Entropies and cross-entropies of exponential families

2010 IEEE International Conference on Image Processing, pp. 3621-3624 (September 2010).
DOI: 10.1109/ICIP.2010.5652054

Abstract

Statistical modeling of images plays a crucial role in modern image processing tasks such as segmentation, object detection, and restoration. Although Gaussian distributions are mathematically convenient to handle, natural image statistics have revealed and emphasized the role of many other types of distributions. In this paper, we consider a versatile class of distributions called exponential families, which encompasses many well-known distributions, such as the Gaussian, Poisson, multinomial, Gamma/Beta, and Dirichlet distributions, to name a few. For those families, we derive mathematical expressions for the Shannon entropy and cross-entropy, give a geometric interpretation, and show that they admit closed-form formulas up to an entropic normalizing constant that depends on the carrier measure but is independent of the member of the family. This allows one to design algorithms that exactly compare entropies and cross-entropies of exponential family distributions, although some of them have, strictly speaking, no known closed form (e.g., the Poisson distribution). We discuss maximum entropy and touch upon the entropy of mixtures of exponential families, for which we provide a relative-entropy upper bound.
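The abstract's claim can be illustrated with the standard exponential-family decomposition p(x; θ) = exp(⟨t(x), θ⟩ − F(θ) + k(x)), for which the entropy is H(p) = F(θ) − ⟨θ, ∇F(θ)⟩ − E_p[k(x)]. The sketch below (not from the paper; all function names are mine, and the formulas used are the textbook natural parameterizations of the Gaussian and Poisson families) checks two things: the Gaussian entropy recovered from F and ∇F matches the classical closed form, and a Poisson cross-entropy difference H(p:q1) − H(p:q2) can be computed exactly because the intractable carrier term E_p[log X!] cancels:

```python
import math

# Exponential family: p(x; theta) = exp(<t(x), theta> - F(theta) + k(x)).
# Entropy: H(p) = F(theta) - <theta, grad F(theta)> - E_p[k(x)].

def gaussian_entropy_expfam(mu, sigma):
    """Entropy of N(mu, sigma^2) via the exponential-family formula (k(x) = 0)."""
    t1, t2 = mu / sigma**2, -1.0 / (2.0 * sigma**2)          # natural parameters
    F = -t1**2 / (4.0 * t2) + 0.5 * math.log(-math.pi / t2)  # log-normalizer
    g1 = -t1 / (2.0 * t2)                                    # dF/dtheta1 = E[x]
    g2 = t1**2 / (4.0 * t2**2) - 1.0 / (2.0 * t2)            # dF/dtheta2 = E[x^2]
    return F - (t1 * g1 + t2 * g2)

def poisson_cross_entropy_gap(lam_p, lam_q1, lam_q2):
    """H(p:q1) - H(p:q2) for Poisson laws. The carrier term E_p[log X!],
    which has no closed form, is shared by both cross-entropies and cancels."""
    # F(theta) = exp(theta), theta = log(lambda), grad F(theta_p) = lam_p.
    return (lam_q1 - lam_q2) - lam_p * (math.log(lam_q1) - math.log(lam_q2))

def poisson_cross_entropy_truncated(lam_p, lam_q, n=300):
    """Brute-force -sum_x p(x) log q(x), truncated; reference check only."""
    ce = 0.0
    for x in range(n):
        logp = -lam_p + x * math.log(lam_p) - math.lgamma(x + 1)
        logq = -lam_q + x * math.log(lam_q) - math.lgamma(x + 1)
        ce -= math.exp(logp) * logq
    return ce

if __name__ == "__main__":
    # Gaussian: agrees with the textbook form 0.5 * log(2*pi*e*sigma^2).
    print(gaussian_entropy_expfam(1.7, 2.3))
    print(0.5 * math.log(2.0 * math.pi * math.e * 2.3**2))
    # Poisson: the exact gap agrees with direct (truncated) summation.
    print(poisson_cross_entropy_gap(3.0, 2.0, 5.0))
    print(poisson_cross_entropy_truncated(3.0, 2.0)
          - poisson_cross_entropy_truncated(3.0, 5.0))
```

The Poisson check is the interesting one: neither H(p:q1) nor H(p:q2) alone has a closed form, yet their difference does, which is exactly what the abstract's "comparison up to an entropic normalizing constant" permits.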
