CODA: Counting Objects via Scale-aware Adversarial Density Adaption
L. Wang, Y. Li, and X. Xue. (2019). arXiv:1903.10442. Comment: Accepted to ICME 2019.
Abstract
Recent advances in crowd counting have achieved promising results with
increasingly complex convolutional neural network designs. However, due to
unpredictable domain shift, generalizing a trained model to unseen scenarios
is often suboptimal. Inspired by the observation that density maps of
different scenarios share similar local structures, we propose a novel
adversarial learning approach, CODA (Counting Objects via scale-aware
adversarial Density Adaption). To deal with different object scales and
density distributions, we perform adversarial training with multi-scale
pyramid patches from both the source and target domains. Together with a
ranking constraint across levels of the pyramid input, consistent object
counts can be produced across scales. Extensive experiments demonstrate that
our network produces much better results on unseen datasets than existing
counting adaption models. Notably, the performance of CODA is comparable
with state-of-the-art fully-supervised models trained on the target dataset.
Further analysis indicates that our density adaption framework extends
effortlessly to scenarios with different objects.
The code is available at https://github.com/Willy0919/CODA.
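The ranking constraint mentioned above rests on a containment argument: a patch from a higher pyramid level contains the smaller patch below it, so its predicted count should never be lower. A minimal hinge-style sketch of that idea (function names, the margin, and the counts are illustrative, not taken from the CODA code):

```python
def ranking_loss(count_large, count_small, margin=0.0):
    """Hinge-style ranking penalty: the larger patch contains the smaller
    one, so its predicted count should not fall below the smaller patch's
    count (by more than `margin`)."""
    return max(0.0, count_small - count_large + margin)

# Predicted counts for three nested pyramid patches (illustrative numbers):
# full image, a half-size crop inside it, a quarter-size crop inside that.
pyramid_counts = [120.0, 80.0, 35.0]

# Penalize any adjacent pair of pyramid levels that violates the ordering.
loss = sum(ranking_loss(pyramid_counts[i], pyramid_counts[i + 1])
           for i in range(len(pyramid_counts) - 1))
print(loss)  # 0.0: these counts already respect the containment order
```

A violating pair, e.g. `ranking_loss(30.0, 50.0)`, yields a positive penalty of `20.0`, pushing the network toward scale-consistent counts.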
Description
CODA: Counting Objects via Scale-aware Adversarial Density Adaption
%0 Generic
%1 wang2019counting
%A Wang, Li
%A Li, Yongbo
%A Xue, Xiangyang
%D 2019
%K adversarial counting domain_adapt loss semisup
%T CODA: Counting Objects via Scale-aware Adversarial Density Adaption
%U http://arxiv.org/abs/1903.10442
%X Recent advances in crowd counting have achieved promising results with
increasingly complex convolutional neural network designs. However, due to
unpredictable domain shift, generalizing a trained model to unseen scenarios
is often suboptimal. Inspired by the observation that density maps of
different scenarios share similar local structures, we propose a novel
adversarial learning approach, CODA (Counting Objects via scale-aware
adversarial Density Adaption). To deal with different object scales and
density distributions, we perform adversarial training with multi-scale
pyramid patches from both the source and target domains. Together with a
ranking constraint across levels of the pyramid input, consistent object
counts can be produced across scales. Extensive experiments demonstrate that
our network produces much better results on unseen datasets than existing
counting adaption models. Notably, the performance of CODA is comparable
with state-of-the-art fully-supervised models trained on the target dataset.
Further analysis indicates that our density adaption framework extends
effortlessly to scenarios with different objects.
The code is available at https://github.com/Willy0919/CODA.
@misc{wang2019counting,
abstract = {Recent advances in crowd counting have achieved promising results with
increasingly complex convolutional neural network designs. However, due to
unpredictable domain shift, generalizing a trained model to unseen scenarios
is often suboptimal. Inspired by the observation that density maps of
different scenarios share similar local structures, we propose a novel
adversarial learning approach, CODA (\emph{Counting Objects via scale-aware
adversarial Density Adaption}). To deal with different object scales and
density distributions, we perform adversarial training with multi-scale
pyramid patches from both the source and target domains. Together with a
ranking constraint across levels of the pyramid input, consistent object
counts can be produced across scales. Extensive experiments demonstrate that
our network produces much better results on unseen datasets than existing
counting adaption models. Notably, the performance of CODA is comparable
with state-of-the-art fully-supervised models trained on the target dataset.
Further analysis indicates that our density adaption framework extends
effortlessly to scenarios with different objects.
\emph{The code is available at https://github.com/Willy0919/CODA.}},
added-at = {2019-04-16T16:43:01.000+0200},
author = {Wang, Li and Li, Yongbo and Xue, Xiangyang},
biburl = {https://www.bibsonomy.org/bibtex/237cb435a30add0adc7b7c022aeed79e6/nmatsuk},
description = {CODA: Counting Objects via Scale-aware Adversarial Density Adaption},
interhash = {5291269503d9c82e92f6398ee9b2ee30},
intrahash = {37cb435a30add0adc7b7c022aeed79e6},
keywords = {adversarial counting domain\_adapt loss semisup},
note = {arXiv:1903.10442. Accepted to ICME 2019},
timestamp = {2019-04-16T16:43:01.000+0200},
title = {CODA: Counting Objects via Scale-aware Adversarial Density Adaption},
url = {http://arxiv.org/abs/1903.10442},
year = 2019
}