@article{shamsolmoali2020imbalanced,
abstract = {Image datasets are often imbalanced, which poses a significant challenge
for deep learning techniques. In this paper, we propose a method to restore
balance in imbalanced image datasets by coalescing two methods: generative
adversarial networks (GANs) and capsule networks. In our model, the generative and
discriminative networks play a novel competitive game, in which the generator
generates samples towards specific classes from a multivariate probability
distribution. The discriminator of our model is designed so that, while
distinguishing real from fake samples, it is also required to assign classes to
its inputs. Since GAN approaches require fully observed data during training,
they tend to generate similar samples when the training data are imbalanced,
leading to overfitting. We address this problem by jointly providing all the
available information from both class components during adversarial training.
This improves learning from imbalanced data by incorporating the structure of
the majority distribution into the generation of new minority samples.
Furthermore, the generator is trained with a feature matching loss to improve
training convergence; this also prevents the generation of outliers and leaves
the majority class space unaffected. Our evaluations show the effectiveness of
the proposed methodology; in particular, the combination of capsule networks and
GANs is effective at recognizing highly overlapping classes with far fewer
parameters than a convolutional GAN.},
added-at = {2020-04-07T12:47:39.000+0200},
author = {Shamsolmoali, Pourya and Zareapoor, Masoumeh and Shen, Linlin and Sadka, Abdul Hamid and Yang, Jie},
biburl = {https://www.bibsonomy.org/bibtex/267838652ced34342851c05f2c0976e97/kirk86},
description = {[2004.02182] Imbalanced Data Learning by Minority Class Augmentation using Capsule Adversarial Networks},
interhash = {77c00d4f13576abea984e674335f7869},
intrahash = {67838652ced34342851c05f2c0976e97},
keywords = {adversarial augmentation generative-models},
note = {cite arxiv:2004.02182},
timestamp = {2020-04-07T12:47:39.000+0200},
title = {Imbalanced Data Learning by Minority Class Augmentation using Capsule Adversarial Networks},
url = {http://arxiv.org/abs/2004.02182},
year = 2020
}