On How Well Generative Adversarial Networks Learn Densities: Nonparametric and Parametric Results

(2018). arXiv:1811.03179. 36 pages, 3 figures.

Abstract

In this paper we study the rate of convergence for learning distributions under the adversarial framework of Generative Adversarial Networks (GANs), which subsumes Wasserstein, Sobolev, and MMD GANs as special cases. We consider a wide range of parametric and nonparametric target distributions under a collection of objective evaluation metrics. On the nonparametric end, we investigate the minimax optimal rates and the fundamental difficulty of density estimation under the adversarial framework. On the parametric end, we establish a theory for general neural network classes (including deep leaky ReLU networks as a special case) that characterizes the interplay between the choice of generator and discriminator. We investigate how to obtain a good statistical guarantee for GANs through the lens of regularization. We discover and isolate a new notion of regularization, called generator/discriminator pair regularization, which sheds light on the advantage of GANs over classical parametric and nonparametric approaches to density estimation. We develop novel oracle inequalities as the main tools for analyzing GANs, which are of independent theoretical interest.
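For context, the adversarial framework referenced here is commonly formalized as minimizing an integral probability metric (IPM) over a discriminator class; the Wasserstein, Sobolev, and MMD variants correspond to different choices of that class. The following is a minimal sketch under that assumption, with the discriminator class \(\mathcal{F}\), generator class \(\mathcal{G}\), and empirical distribution \(\hat{\mu}_n\) introduced here for illustration (this notation is not taken from the entry above):

\[
  d_{\mathcal{F}}(\mu, \nu) \;=\; \sup_{f \in \mathcal{F}} \Bigl|\, \mathbb{E}_{X \sim \mu} f(X) \;-\; \mathbb{E}_{Y \sim \nu} f(Y) \,\Bigr|,
  \qquad
  \hat{\nu} \;\in\; \operatorname*{arg\,min}_{\nu \in \mathcal{G}} \, d_{\mathcal{F}}(\hat{\mu}_n, \nu).
\]

With \(\mathcal{F}\) the set of 1-Lipschitz functions, \(d_{\mathcal{F}}\) is the Wasserstein-1 distance; with \(\mathcal{F}\) a Sobolev ball, the Sobolev GAN objective; with \(\mathcal{F}\) the unit ball of an RKHS, the MMD. The convergence rates and oracle inequalities described in the abstract quantify how fast \(d_{\mathcal{F}}(\mu, \hat{\nu})\) can shrink with the sample size, depending on the complexity of \(\mathcal{F}\) and \(\mathcal{G}\).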

