Abstract
In an effort to improve the performance of deep neural networks in
data-scarce, non-i.i.d., or unsupervised settings, much recent research has
been devoted to encoding invariance under symmetry transformations into neural
network architectures. We treat the neural network input and output as random
variables, and consider group invariance from the perspective of probabilistic
symmetry. Drawing on tools from probability and statistics, we establish a link
between functional and probabilistic symmetry, and obtain generative functional
representations of joint and conditional probability distributions that are
invariant or equivariant under the action of a compact group. Those
representations completely characterize the structure of neural networks that
can be used to model such distributions and yield a general program for
constructing invariant stochastic or deterministic neural networks. We develop
the details of the general program for exchangeable sequences and arrays,
recovering a number of recent examples as special cases.
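For exchangeable sequences, the program yields permutation-invariant architectures built from a symmetric pooling of per-element embeddings (the Deep Sets pattern is one such special case). A minimal sketch, with all weights, shapes, and names purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny sum-pooling network:
# f(X) = rho(sum_i phi(x_i)) is invariant to permutations of the rows of X.
W_phi = rng.normal(size=(3, 8))   # per-element feature map phi
W_rho = rng.normal(size=(8, 1))   # readout rho on the pooled features

def invariant_net(X):
    """Permutation-invariant network: embed each element, sum-pool, read out."""
    H = np.tanh(X @ W_phi)        # phi applied row-wise: (n, 3) -> (n, 8)
    pooled = H.sum(axis=0)        # symmetric pooling gives the invariance
    return np.tanh(pooled @ W_rho)

X = rng.normal(size=(5, 3))
perm = rng.permutation(5)
# Shuffling the sequence leaves the output unchanged.
assert np.allclose(invariant_net(X), invariant_net(X[perm]))
```

Because the pooling operation is symmetric in its inputs, the output distribution of such a network is invariant under the group of row permutations, which is the functional counterpart of exchangeability discussed in the abstract.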