
Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences

M. Ashok Kumar and Kumar Vijay Mishra (2020). arXiv:2001.04769. Comment: 20 pages.

Abstract

We study the geometry of probability distributions with respect to a generalized family of Csiszár $f$-divergences. A member of this family is the relative $\alpha$-entropy, which is a Rényi analog of relative entropy in information theory and is known as the logarithmic or projective power divergence in statistics. We apply Eguchi's theory to derive the Fisher information metric and the dual affine connections arising from these generalized divergence functions. This enables us to arrive at a more widely applicable version of the Cramér-Rao inequality, which provides a lower bound for the variance of an estimator of an escort of the underlying parametric probability distribution. We then extend Amari and Nagaoka's dually flat structure of the exponential and mixture models to other distributions with respect to the aforementioned generalized metric. We show that these formulations lead us to unbiased and efficient estimators for the escort model.
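The abstract's main objects (the escort distribution, the relative $\alpha$-entropy, and the metric obtained from it via Eguchi's theory) can be illustrated numerically. The following is a minimal sketch for a Bernoulli family, assuming the standard discrete-alphabet definition of the relative $\alpha$-entropy, $\mathscr{I}_\alpha(p,q) = \frac{\alpha}{1-\alpha}\log\sum_x p(x)\,q(x)^{\alpha-1} - \frac{1}{1-\alpha}\log\sum_x p(x)^\alpha + \log\sum_x q(x)^\alpha$, and Eguchi's second-derivative recipe for the metric induced by a divergence; the function names below are illustrative, not taken from the paper.

```python
import numpy as np

def bernoulli(theta):
    """Bernoulli pmf as a length-2 vector [P(X=0), P(X=1)]."""
    return np.array([1.0 - theta, theta])

def escort(p, alpha):
    """Escort distribution: p^(alpha)(x) = p(x)^alpha / sum_y p(y)^alpha."""
    w = p ** alpha
    return w / w.sum()

def relative_alpha_entropy(p, q, alpha):
    """Relative alpha-entropy I_alpha(p, q) for discrete p, q (alpha != 1).
    It equals the Renyi divergence of order 1/alpha between the escorts of
    p and q, and tends to the KL divergence as alpha -> 1."""
    t1 = alpha / (1.0 - alpha) * np.log(np.sum(p * q ** (alpha - 1.0)))
    t2 = -1.0 / (1.0 - alpha) * np.log(np.sum(p ** alpha))
    t3 = np.log(np.sum(q ** alpha))
    return t1 + t2 + t3

def eguchi_metric(theta, alpha, h=1e-3):
    """Metric induced by the divergence via Eguchi's recipe:
    g(theta) = -d/dtheta d/dtheta' I_alpha(p_theta, p_theta') at theta' = theta,
    approximated here with central finite differences."""
    def D(t, s):
        return relative_alpha_entropy(bernoulli(t), bernoulli(s), alpha)
    return -(D(theta + h, theta + h) - D(theta + h, theta - h)
             - D(theta - h, theta + h) + D(theta - h, theta - h)) / (4.0 * h * h)

theta = 0.3
print("escort of Bernoulli(0.3) at alpha=0.7:", escort(bernoulli(theta), 0.7))
for alpha in (0.7, 0.999):
    print(f"g^(alpha) at alpha={alpha}:", eguchi_metric(theta, alpha))
print("classical Fisher information 1/(theta*(1-theta)):",
      1.0 / (theta * (1.0 - theta)))
```

As $\alpha \to 1$ the relative $\alpha$-entropy reduces to relative entropy, so the finite-difference metric should approach the classical Fisher information $1/(\theta(1-\theta))$; for other $\alpha$ it yields the generalized metric that enters the escort version of the Cramér-Rao bound described in the abstract.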


Links and resources

URL: https://arxiv.org/abs/2001.04769
BibTeX key: kumar2020cramerrao
