On $w$-mixtures: Finite convex combinations of prescribed component
distributions
F. Nielsen and R. Nock (2017). arXiv:1708.00568. Comment: 25 pages.
Abstract
We consider the space of $w$-mixtures, that is, the set of finite statistical
mixtures sharing the same prescribed component distributions. The geometry
induced by the Kullback-Leibler (KL) divergence on this family of $w$-mixtures
is a dually flat space in information geometry, called the mixture family
manifold. It follows that the KL divergence between two $w$-mixtures is
equivalent to a Bregman divergence (BD) defined for the negative Shannon
entropy generator. Thus the KL divergence between two Gaussian Mixture Models
(GMMs) sharing the same components is (theoretically) a Bregman divergence.
This KL-BD equivalence implies that we can perform optimal KL-averaging
aggregation of $w$-mixtures without information loss. More generally, we prove
that the skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a
skew Jensen divergence on their parameters. Finally, we state several
divergence identities and inequalities relating $w$-mixtures.
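
The central identity is easy to test numerically. Writing a $w$-mixture as $m_w(x) = \sum_i w_i p_i(x)$ for fixed components $p_i$, the KL-BD equivalence states that $\mathrm{KL}(m_w : m_{w'}) = F(w) - F(w') - \langle w - w', \nabla F(w') \rangle$, where $F(w) = -H(m_w) = \int m_w \log m_w \,\mathrm{d}x$ is the negative Shannon entropy of the mixture. The sketch below is our own illustration, not code from the paper: it checks the identity for two mixtures of the same two fixed Gaussians, where the component choices, integration grid, and finite-difference gradient are assumptions made for the demonstration.

import numpy as np
from scipy.stats import norm

# Shared prescribed components: two fixed unit-variance Gaussians (assumption).
components = [norm(loc=-2.0, scale=1.0), norm(loc=2.0, scale=1.0)]
x = np.linspace(-12.0, 12.0, 20001)  # quadrature grid (assumption)
dx = x[1] - x[0]

def mixture_pdf(w):
    # m_w(x) = sum_i w_i p_i(x): a w-mixture with fixed components.
    return sum(wi * c.pdf(x) for wi, c in zip(w, components))

def neg_entropy(eta):
    # Bregman generator F(eta) = -H(m_w) for weights w = (eta, 1 - eta).
    m = mixture_pdf((eta, 1.0 - eta))
    return float(np.sum(m * np.log(m)) * dx)

def kl(w1, w2):
    # Direct KL divergence between the two w-mixtures by quadrature.
    m1, m2 = mixture_pdf(w1), mixture_pdf(w2)
    return float(np.sum(m1 * np.log(m1 / m2)) * dx)

def bregman(e1, e2, h=1e-5):
    # B_F(e1 : e2) = F(e1) - F(e2) - (e1 - e2) F'(e2), with F'(e2)
    # estimated by central finite differences (assumption).
    grad = (neg_entropy(e2 + h) - neg_entropy(e2 - h)) / (2.0 * h)
    return neg_entropy(e1) - neg_entropy(e2) - (e1 - e2) * grad

print(kl((0.3, 0.7), (0.6, 0.4)))  # direct KL between the two w-mixtures
print(bregman(0.3, 0.6))           # agrees up to numerical error

The practical point of the equivalence is visible here: KL computations between same-component mixtures collapse onto the low-dimensional weight simplex, which is what makes the closed-form KL-averaging aggregation mentioned in the abstract possible.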
@article{nielsen2017wmixtures,
author = {Nielsen, Frank and Nock, Richard},
keywords = {divergences entropy information readings theory},
note = {arXiv:1708.00568. Comment: 25 pages},
title = {On $w$-mixtures: Finite convex combinations of prescribed component
distributions},
url = {http://arxiv.org/abs/1708.00568},
year = 2017
}