Abstract
Optimal transport (OT) distances are increasingly used as loss functions for
statistical inference, notably in generative modeling and supervised learning.
Yet the behavior of minimum Wasserstein estimators is
poorly understood, notably in high-dimensional regimes or under model
misspecification. In this work we adopt the viewpoint of projection robust (PR)
OT, which seeks to maximize the OT cost between two measures by choosing a
$k$-dimensional subspace onto which they can be projected. Our first
contribution is to establish several fundamental statistical properties of PR
Wasserstein distances, complementing and improving previous literature that has
been restricted to one-dimensional and well-specified cases. Next, we propose
the integral PR Wasserstein (IPRW) distance as an alternative to the PRW
distance, obtained by averaging rather than optimizing over subspaces. Our complexity
bounds can help explain why both PRW and IPRW distances outperform Wasserstein
distances empirically in high-dimensional inference tasks. Finally, we consider
parametric inference using the PRW distance. We provide asymptotic guarantees
for two types of minimum PRW estimators and formulate a central limit theorem
for the max-sliced Wasserstein estimator under model misspecification. To enable
our analysis of PRW with projection dimension larger than one, we devise a
novel combination of variational analysis and statistical theory.
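To make the two constructions concrete, the following is a minimal NumPy sketch, restricted to the special case of projection dimension k = 1 (where PRW reduces to the max-sliced Wasserstein distance and IPRW to the sliced Wasserstein distance, and the projected OT problem has a closed form via sorting). The function names and the Monte Carlo sampling of directions are illustrative choices, not the paper's algorithms.

```python
import numpy as np

def w1_1d(x, y):
    # Wasserstein-1 distance between two 1-D empirical measures with
    # uniform weights and equal sample sizes: mean absolute difference
    # of the sorted samples (order statistics).
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def iprw_estimate(X, Y, n_proj=200, seed=0):
    # Monte Carlo sketch of the IPRW distance for k = 1 (i.e., sliced
    # Wasserstein): average the projected W1 over random unit directions.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # uniform direction on the sphere
        total += w1_1d(X @ theta, Y @ theta)    # OT cost on the projected samples
    return total / n_proj

def prw_estimate(X, Y, n_proj=200, seed=0):
    # Crude sketch of the PRW distance for k = 1 (i.e., max-sliced
    # Wasserstein): maximize instead of average over the same directions.
    # (A practical solver would optimize over the subspace directly.)
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    best = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        best = max(best, w1_1d(X @ theta, Y @ theta))
    return best
```

Because the maximum over a set of directions dominates the average over that set, the PRW estimate is always at least the IPRW estimate on the same samples; for k > 1 one would replace the sorting step with a k-dimensional OT solve on the projected point clouds.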