An entropy inequality for symmetric random variables
J. Hao and V. Jog (2018). arXiv:1801.03868. Comment: submitted to ISIT 2018.
Abstract
We establish a lower bound on the entropy of weighted sums of (possibly
dependent) random variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric
joint distribution. Our lower bound is in terms of the joint entropy of $(X_1,
X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and
only if $X_i$'s are i.i.d.\ Gaussian random variables. For $n=2$ there are
numerous other cases of equality apart from i.i.d.\ Gaussians, which we
completely characterize. Going beyond sums, we also present an inequality for
certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical
contribution lies in the analysis of the equality cases, and our approach
relies on the geometry and the symmetry of the problem.
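For background, the setting of this paper generalizes the classical Shannon entropy power inequality (EPI). The following is the standard EPI, stated here only as context; it is not the paper's new bound, which instead involves the joint entropy of possibly dependent, symmetrically distributed variables:

```latex
% Shannon's entropy power inequality (classical background result):
% for independent random vectors X, Y in R^d with densities,
% the entropy power N(X) = e^{2h(X)/d} / (2\pi e) satisfies
N(X + Y) \;\geq\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/d},
% with equality iff X and Y are Gaussian with proportional covariances.
```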
%0 Journal Article
%1 hao2018entropy
%A Hao, Jing
%A Jog, Varun
%D 2018
%K entropy-inequalities
%T An entropy inequality for symmetric random variables
%U http://arxiv.org/abs/1801.03868
%X We establish a lower bound on the entropy of weighted sums of (possibly
dependent) random variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric
joint distribution. Our lower bound is in terms of the joint entropy of $(X_1,
X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and
only if $X_i$'s are i.i.d.\ Gaussian random variables. For $n=2$ there are
numerous other cases of equality apart from i.i.d.\ Gaussians, which we
completely characterize. Going beyond sums, we also present an inequality for
certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical
contribution lies in the analysis of the equality cases, and our approach
relies on the geometry and the symmetry of the problem.
@article{hao2018entropy,
abstract = {We establish a lower bound on the entropy of weighted sums of (possibly
dependent) random variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric
joint distribution. Our lower bound is in terms of the joint entropy of $(X_1,
X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and
only if $X_i$'s are i.i.d.\ Gaussian random variables. For $n=2$ there are
numerous other cases of equality apart from i.i.d.\ Gaussians, which we
completely characterize. Going beyond sums, we also present an inequality for
certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical
contribution lies in the analysis of the equality cases, and our approach
relies on the geometry and the symmetry of the problem.},
added-at = {2018-01-17T23:05:26.000+0100},
author = {Hao, Jing and Jog, Varun},
biburl = {https://www.bibsonomy.org/bibtex/2e76de3c6bdd0588ed6852043ee4a72e4/claired},
description = {An entropy inequality for symmetric random variables},
interhash = {92ca335173cfb299c48d21a2e77dc081},
intrahash = {e76de3c6bdd0588ed6852043ee4a72e4},
keywords = {entropy-inequalities},
note = {arXiv:1801.03868. Comment: submitted to ISIT 2018},
timestamp = {2018-01-17T23:05:26.000+0100},
title = {An entropy inequality for symmetric random variables},
url = {http://arxiv.org/abs/1801.03868},
year = 2018
}