Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences
M. Kumar and K. Mishra (2020). arXiv:2001.04769. Comment: 20 pages.
Abstract
We study the geometry of probability distributions with respect to a
generalized family of Csiszár $f$-divergences. A member of this family is the
relative $\alpha$-entropy, which is a Rényi analog of relative entropy in
information theory and is known as the logarithmic or projective power divergence in
statistics. We apply Eguchi's theory to derive the Fisher information metric
and the dual affine connections arising from these generalized divergence
functions. This framework enables us to arrive at a more widely applicable version
of the Cramér-Rao inequality, which provides a lower bound on the variance
of an estimator for an escort of the underlying parametric probability
distribution. We then extend Amari and Nagaoka's dually flat structure of the
exponential and mixture models to other distributions with respect to the
aforementioned generalized metric. We show that these formulations lead to
unbiased and efficient estimators for the escort model.
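For context, the two classical objects the abstract generalizes can be stated as follows. This is standard background only, not the paper's generalized result: the escort distribution of order $\alpha$ and the usual scalar Cramér-Rao inequality.

```latex
% Escort distribution of order \alpha (standard definition)
\[
  p^{(\alpha)}(x) \;=\; \frac{p(x)^{\alpha}}{\sum_{y} p(y)^{\alpha}},
  \qquad \alpha > 0.
\]

% Classical Cramér-Rao inequality: for an unbiased estimator
% \hat{\theta} of a scalar parameter \theta,
\[
  \operatorname{Var}_{\theta}\!\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)},
  \qquad
  I(\theta) \;=\; \mathbb{E}_{\theta}\!\left[
    \left(\frac{\partial}{\partial\theta}\log p_{\theta}(X)\right)^{\!2}
  \right].
\]
```

The paper's contribution is a version of the second inequality that bounds the variance of estimators for the escort model $p^{(\alpha)}_{\theta}$, with the Fisher information replaced by the metric induced (via Eguchi's construction) by the generalized divergence.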
Description
[2001.04769] Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences
%0 Journal Article
%1 kumar2020cramerrao
%A Kumar, M. Ashok
%A Mishra, Kumar Vijay
%D 2020
%K bounds divergences readings
%T Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences
%U http://arxiv.org/abs/2001.04769
%X We study the geometry of probability distributions with respect to a
generalized family of Csiszár $f$-divergences. A member of this family is the
relative $\alpha$-entropy, which is a Rényi analog of relative entropy in
information theory and is known as the logarithmic or projective power divergence in
statistics. We apply Eguchi's theory to derive the Fisher information metric
and the dual affine connections arising from these generalized divergence
functions. This framework enables us to arrive at a more widely applicable version
of the Cramér-Rao inequality, which provides a lower bound on the variance
of an estimator for an escort of the underlying parametric probability
distribution. We then extend Amari and Nagaoka's dually flat structure of the
exponential and mixture models to other distributions with respect to the
aforementioned generalized metric. We show that these formulations lead to
unbiased and efficient estimators for the escort model.
@article{kumar2020cramerrao,
abstract = {We study the geometry of probability distributions with respect to a
generalized family of Csisz\'ar $f$-divergences. A member of this family is the
relative $\alpha$-entropy, which is a R\'enyi analog of relative entropy in
information theory and is known as the logarithmic or projective power divergence in
statistics. We apply Eguchi's theory to derive the Fisher information metric
and the dual affine connections arising from these generalized divergence
functions. This framework enables us to arrive at a more widely applicable version
of the Cram\'er-Rao inequality, which provides a lower bound on the variance
of an estimator for an escort of the underlying parametric probability
distribution. We then extend Amari and Nagaoka's dually flat structure of the
exponential and mixture models to other distributions with respect to the
aforementioned generalized metric. We show that these formulations lead to
unbiased and efficient estimators for the escort model.},
added-at = {2020-01-16T00:21:06.000+0100},
author = {Kumar, M. Ashok and Mishra, Kumar Vijay},
biburl = {https://www.bibsonomy.org/bibtex/26a906ca38823292accb6ee244593f7ef/kirk86},
description = {[2001.04769] Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences},
interhash = {4cfd6c34b86ff7fae33b5d900adfb6ca},
intrahash = {6a906ca38823292accb6ee244593f7ef},
keywords = {bounds divergences readings},
note = {arXiv:2001.04769. Comment: 20 pages},
timestamp = {2020-01-16T00:21:06.000+0100},
title = {Cram\'er-Rao Lower Bounds Arising from Generalized Csisz\'ar Divergences},
url = {http://arxiv.org/abs/2001.04769},
year = 2020
}