Abstract
Machine learning is increasingly targeting areas where input data cannot be
accurately described by a single vector, but can be modeled instead using the
more flexible concept of random vectors, namely probability measures or more
simply point clouds of varying cardinality. Using deep architectures on
measures poses, however, many challenging issues. Indeed, deep architectures
are originally designed to handle fixed-length vectors, or, using recursive
mechanisms, ordered sequences thereof. In sharp contrast, measures describe a
varying number of weighted observations with no particular order. We propose in
this work a deep framework designed to handle crucial aspects of measures,
namely permutation invariance and variations in weights and cardinality.
Architectures derived from this pipeline can (i) map measures to measures,
using the concept of push-forward operators; (ii) bridge the gap between
measures and Euclidean spaces, through integration steps. This makes it
possible to design discriminative networks (to classify or reduce the
dimensionality of input measures), generative architectures (to synthesize
measures), and
recurrent pipelines (to predict measure dynamics). We provide a theoretical
analysis of these building blocks, review our architectures' approximation
abilities and robustness with respect to perturbations, and evaluate them on
various discriminative and generative tasks.
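
The two building blocks named above, push-forward layers that map measures to
measures and integration steps that map measures to fixed-length vectors, can
be illustrated with a minimal sketch. The code below assumes PyTorch; the
names `PushForward` and `Integrate` are hypothetical and not the paper's API,
but the construction (a pointwise map followed by a weighted mean) is
permutation invariant by design and handles point clouds of any cardinality.

```python
import torch
import torch.nn as nn

class PushForward(nn.Module):
    # Applies the same map f to every support point of the input measure.
    # Acting pointwise keeps the block permutation-equivariant and
    # independent of the number of points. (Hypothetical name, for illustration.)
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim_in, dim_out), nn.ReLU())

    def forward(self, points, weights):
        # points: (n, dim_in); weights: (n,), nonnegative, summing to 1
        return self.f(points), weights  # weights are carried along unchanged

class Integrate(nn.Module):
    # Bridges measures and Euclidean space: a weighted mean of point
    # features, i.e. the integral of the feature map against the measure.
    def forward(self, points, weights):
        return (weights.unsqueeze(-1) * points).sum(dim=0)

# A weighted point cloud: unordered, with arbitrary cardinality n.
n, d = 7, 3
points = torch.randn(n, d)
weights = torch.full((n,), 1.0 / n)

pooled = Integrate()(*PushForward(d, 16)(points, weights))
print(pooled.shape)  # torch.Size([16]), regardless of n or of point order
```

Permuting the rows of `points` together with `weights` leaves `pooled`
unchanged, which is exactly the invariance discussed above; the weighted mean
can be replaced by any other integral of learned features while preserving
that property.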