Abstract
We introduce Projected Latent Markov Chain Monte Carlo (PL-MCMC), a technique
for sampling from the high-dimensional conditional distributions learned by a
normalizing flow. We prove that PL-MCMC asymptotically samples from the exact
conditional distributions associated with a normalizing flow. As a conditional
sampling method, PL-MCMC enables Monte Carlo Expectation Maximization (MC-EM)
training of normalizing flows from incomplete data. By providing experimental
results for a variety of data sets, we demonstrate the practicality and
effectiveness of PL-MCMC for missing data inference using normalizing flows.
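The core idea can be illustrated with a heavily simplified sketch: propose moves in the flow's latent space, project the resulting sample back onto the observed coordinates, and accept or reject by Metropolis-Hastings under the flow's density. Everything below is illustrative (the toy affine "flow", the proposal scale, the observation setup are all assumptions), not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy invertible "flow": x = A z + b, with z ~ N(0, I). This is an
# illustrative stand-in for a trained normalizing flow; its log-density
# follows from the change-of-variables formula.
A = np.array([[1.0, 0.5], [0.0, 1.0]])
b = np.array([0.2, -0.1])
A_inv = np.linalg.inv(A)
log_det_inv = np.log(abs(np.linalg.det(A_inv)))

def flow_log_density(x):
    z = A_inv @ (x - b)
    return -0.5 * z @ z - np.log(2 * np.pi) + log_det_inv

# Condition on the observed coordinate x[0] and sample the missing x[1]:
# propose in latent space, map back through the flow, project onto the
# observation, and apply a Metropolis-Hastings accept/reject step.
x_obs = 1.0
x = np.array([x_obs, 0.0])
samples = []
for _ in range(5000):
    z = A_inv @ (x - b)
    z_prop = z + 0.5 * rng.standard_normal(2)   # latent-space proposal
    x_prop = A @ z_prop + b
    x_prop[0] = x_obs                           # project onto observed data
    if np.log(rng.uniform()) < flow_log_density(x_prop) - flow_log_density(x):
        x = x_prop
    samples.append(x[1])

# After burn-in, the chain's samples approximate p(x[1] | x[0] = x_obs)
# under the flow's joint density.
print(np.mean(samples[1000:]))
```

For this Gaussian toy model the conditional p(x[1] | x[0] = 1) is available in closed form, so the chain's post-burn-in mean can be checked against it; with a genuine normalizing flow only the MCMC route is available, which is what makes the asymptotic-exactness guarantee useful.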