Abstract

Divergences, also known as contrast functions, are distance-like quantities defined on manifolds of non-negative or probability measures, and they arise in various theoretical and applied problems. Using ideas from optimal transport, we introduce and study a parameterized family of $L^{(\alpha)}$-divergences which includes the Bregman divergence, corresponding to the Euclidean quadratic cost, and the $L$-divergence introduced by Pal and Wong in connection with portfolio theory and a logarithmic cost function. Using this unified framework, which elucidates the arguments in our previous work, we prove that these divergences induce geometric structures that are dually projectively flat with constant curvatures, and that the generalized Pythagorean theorem holds true. Conversely, we show that if a statistical manifold is dually projectively flat with constant curvature $-\alpha$ where $\alpha > 0$, then it is locally induced by an $L^{(\alpha)}$-divergence. We define in this context a canonical divergence which extends the one for dually flat manifolds. Finally, we study generalizations of the exponential family and show that the $L^{(\pm\alpha)}$-divergence of the corresponding potential functions gives the Rényi divergence.
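To illustrate the claim that the family includes the Bregman divergence as a special case, here is a minimal numerical sketch. The explicit formula used below, $D^{(\alpha)}[x:y] = \frac{1}{\alpha}\log\big(1+\alpha\,\nabla\varphi(y)\cdot(x-y)\big) - (\varphi(x)-\varphi(y))$ for an $\alpha$-exponentially concave $\varphi$, is a common form from the related literature and is stated here as an assumption, not quoted from this abstract; as $\alpha \to 0$ it reduces to the Bregman divergence of $-\varphi$.

```python
import numpy as np

def l_alpha_divergence(phi, grad_phi, x, y, alpha):
    """Sketch of an L^(alpha)-divergence (formula assumed, see lead-in):
    D[x:y] = (1/alpha) * log(1 + alpha * grad_phi(y).(x - y)) - (phi(x) - phi(y)).
    At alpha == 0 we take the limit, which is the Bregman divergence of -phi."""
    inner = np.dot(grad_phi(y), x - y)
    if alpha == 0.0:
        # limit alpha -> 0: log(1 + alpha*t)/alpha -> t
        return inner - (phi(x) - phi(y))
    return np.log1p(alpha * inner) / alpha - (phi(x) - phi(y))

# Example: phi(v) = -||v||^2 / 2 (concave), so -phi is the Euclidean
# quadratic cost; the alpha = 0 case then gives ||x - y||^2 / 2.
phi = lambda v: -0.5 * np.dot(v, v)
grad_phi = lambda v: -v
x, y = np.array([1.0, 2.0]), np.array([0.0, 1.0])
bregman = l_alpha_divergence(phi, grad_phi, x, y, 0.0)   # = 1.0 here
approx = l_alpha_divergence(phi, grad_phi, x, y, 1e-8)   # close to bregman
```

For this quadratic example the $\alpha = 0$ value equals $\tfrac12\|x-y\|^2$, and small positive $\alpha$ gives values converging to it, consistent with the Bregman divergence arising as a limiting member of the family.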

Description

Statistical manifolds from optimal transport
