@misc{bergmann2017learning,
abstract = {This paper introduces a novel approach to texture synthesis based on
generative adversarial networks (GAN) (Goodfellow et al., 2014). We extend the
structure of the input noise distribution by constructing tensors with
different types of dimensions. We call this technique Periodic Spatial GAN
(PSGAN). The PSGAN has several novel abilities which surpass the current state
of the art in texture synthesis. First, we can learn multiple textures from
datasets of one or more complex large images. Second, we show that the image
generation with PSGANs has properties of a texture manifold: we can smoothly
interpolate between samples in the structured noise space and generate novel
samples, which lie perceptually between the textures of the original dataset.
In addition, we can also accurately learn periodical textures. We make multiple
experiments which show that PSGANs can flexibly handle diverse texture and
image data sources. Our method is highly scalable and it can generate output
images of arbitrary large size.},
added-at = {2017-05-23T12:19:40.000+0200},
author = {Bergmann, Urs and Jetchev, Nikolay and Vollgraf, Roland},
biburl = {https://www.bibsonomy.org/bibtex/25764650c438d001d4ce3394e11a3e764/joepmoritz},
description = {Learning Texture Manifolds with the Periodic Spatial GAN},
interhash = {be638d23a8ca751f9ead2320fc09b9f9},
intrahash = {5764650c438d001d4ce3394e11a3e764},
  keywords = {TextureSynthesis},
note = {cite arxiv:1705.06566},
timestamp = {2017-05-23T12:20:26.000+0200},
title = {Learning Texture Manifolds with the Periodic Spatial GAN},
url = {http://arxiv.org/abs/1705.06566},
year = 2017
}