Don't Blame the ELBO! A Linear VAE Perspective on Posterior Collapse
J. Lucas, G. Tucker, R. Grosse, and M. Norouzi. (2019). arXiv:1911.02469. Comment: 11 main pages, 10 appendix pages; 13 figures total. Accepted at the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019).
Abstract
Posterior collapse in Variational Autoencoders (VAEs) arises when the
variational posterior distribution closely matches the prior for a subset of
latent variables. This paper presents a simple and intuitive explanation for
posterior collapse through the analysis of linear VAEs and their direct
correspondence with Probabilistic PCA (pPCA). We explain how posterior collapse
may occur in pPCA due to local maxima in the log marginal likelihood.
Unexpectedly, we prove that the ELBO objective for the linear VAE does not
introduce additional spurious local maxima relative to log marginal likelihood.
We show further that training a linear VAE with exact variational inference
recovers an identifiable global maximum corresponding to the principal
component directions. Empirically, we find that our linear analysis is
predictive even for high-capacity, non-linear VAEs and helps explain the
relationship between the observation noise, local maxima, and posterior
collapse in deep Gaussian VAEs.
@article{lucas2019blame,
abstract = {Posterior collapse in Variational Autoencoders (VAEs) arises when the
variational posterior distribution closely matches the prior for a subset of
latent variables. This paper presents a simple and intuitive explanation for
posterior collapse through the analysis of linear VAEs and their direct
correspondence with Probabilistic PCA (pPCA). We explain how posterior collapse
may occur in pPCA due to local maxima in the log marginal likelihood.
Unexpectedly, we prove that the ELBO objective for the linear VAE does not
introduce additional spurious local maxima relative to log marginal likelihood.
We show further that training a linear VAE with exact variational inference
recovers an identifiable global maximum corresponding to the principal
component directions. Empirically, we find that our linear analysis is
predictive even for high-capacity, non-linear VAEs and helps explain the
relationship between the observation noise, local maxima, and posterior
collapse in deep Gaussian VAEs.},
added-at = {2019-12-19T00:22:32.000+0100},
author = {Lucas, James and Tucker, George and Grosse, Roger and Norouzi, Mohammad},
biburl = {https://www.bibsonomy.org/bibtex/2f8b9da98f822ad947c9e447004abae9f/kirk86},
description = {[1911.02469] Don't Blame the ELBO! A Linear VAE Perspective on Posterior Collapse},
interhash = {4700607448c23d9abeb9cb387c852f93},
intrahash = {f8b9da98f822ad947c9e447004abae9f},
keywords = {bayesian readings uncertainty},
note = {arXiv:1911.02469. Comment: 11 main pages, 10 appendix pages; 13 figures total. Accepted at the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)},
timestamp = {2019-12-19T00:22:47.000+0100},
title = {Don't Blame the {ELBO}! A Linear {VAE} Perspective on Posterior Collapse},
url = {http://arxiv.org/abs/1911.02469},
year = 2019
}