Article

Discrete Neural Processes

(2018). cite arxiv:1901.00409.

Abstract

Many data generating processes involve latent random variables over discrete combinatorial spaces whose size grows factorially with the dataset. In these settings, existing posterior inference methods can be inaccurate and/or very slow. In this work we develop methods for efficient amortized approximate Bayesian inference over discrete combinatorial spaces, with applications to probabilistic clustering (such as Dirichlet process mixture models), random communities (such as stochastic block models), and random permutations. The approach exploits the exchangeability of the generative models and is based on mapping distributed, permutation-invariant representations of discrete arrangements into conditional probabilities. The resulting algorithms parallelize easily, yield i.i.d. samples from the approximate posteriors along with a probability estimate of each sample (a quantity generally unavailable using Markov chain Monte Carlo), and can be applied to both conjugate and non-conjugate models, as training only requires samples from the generative model. As a scientific application, we present a novel approach to spike sorting for high-density multielectrode probes.
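To make the abstract's idea concrete, the following is a minimal sketch, not the authors' implementation: all module names (AmortizedClusterer, phi, rho), architecture choices, and dimensions are illustrative assumptions. It shows a DeepSets-style permutation-invariant encoder that summarizes each existing cluster, a small head that maps those summaries to conditional probabilities for the next point's assignment, and ancestral sampling that yields i.i.d. partitions together with a log-probability estimate for each sample.

import torch
import torch.nn as nn

class AmortizedClusterer(nn.Module):
    # Hypothetical module, loosely in the spirit of the paper's approach.
    def __init__(self, x_dim=2, h_dim=64):
        super().__init__()
        # Per-point encoder; summing its outputs within a cluster gives a
        # permutation-invariant cluster summary (DeepSets construction).
        self.phi = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, h_dim))
        # Scores one candidate assignment from (cluster summary + point embedding).
        self.rho = nn.Sequential(nn.Linear(h_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, 1))

    def logits(self, x_next, clusters):
        # clusters: list of (n_k, x_dim) tensors; the final candidate is an
        # empty "new cluster". Point order within a cluster does not matter.
        e_next = self.phi(x_next)
        scores = []
        for pts in clusters + [torch.zeros(0, x_next.shape[-1])]:
            summary = self.phi(pts).sum(0) if len(pts) else torch.zeros_like(e_next)
            scores.append(self.rho(summary + e_next))
        return torch.cat(scores)  # shape (K+1,)

@torch.no_grad()
def sample_partition(model, X):
    # Ancestral sampling: assign points one at a time, accumulating the
    # log-probability of the sampled partition (the per-sample probability
    # estimate mentioned in the abstract).
    labels, logp = [0], 0.0
    for i in range(1, len(X)):
        clusters = [X[[j for j, c in enumerate(labels) if c == k]]
                    for k in range(max(labels) + 1)]
        probs = torch.softmax(model.logits(X[i], clusters), dim=0)
        c = torch.multinomial(probs, 1).item()
        labels.append(c)
        logp += torch.log(probs[c]).item()
    return labels, logp

model = AmortizedClusterer()
X = torch.randn(10, 2)
labels, logp = sample_partition(model, X)
print(labels, logp)

Because sampling only invokes the model's forward pass, many partitions can be drawn in parallel, and training such a network would only require (data, partition) pairs simulated from the generative model, matching the amortized setup the abstract describes.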

Users

  • @kirk86
