
DeepSentiBank: Visual Sentiment Concept Classification with Deep Convolutional Neural Networks

Tao Chen, Damian Borth, Trevor Darrell, and Shih-Fu Chang. (2014). cite arxiv:1410.8586. Comment: 7 pages, 4 figures.

Abstract

This paper introduces a visual sentiment concept classification method based on deep convolutional neural networks (CNNs). The visual sentiment concepts are adjective noun pairs (ANPs) automatically discovered from the tags of web photos, and they can be utilized as effective statistical cues for detecting the emotions depicted in images. Nearly one million Flickr images tagged with these ANPs are downloaded to train the concept classifiers. We adopt the popular deep convolutional neural network architecture, which has recently shown great performance improvements in classifying large-scale web image datasets such as ImageNet. Our deep CNN model is trained with Caffe, a newly developed deep learning framework. To deal with the biased training data, which contain only images with strong sentiment, and to prevent overfitting, we initialize the model with weights pre-trained on ImageNet. Performance evaluation shows that the newly trained deep CNN model, SentiBank 2.0 (also called DeepSentiBank), significantly improves both annotation accuracy and retrieval performance compared to its predecessors, which mainly use binary SVM classification models.
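The fine-tuning strategy described in the abstract (initializing the network from ImageNet-pretrained weights before training on the Flickr ANP images) can be sketched with the pycaffe interface roughly as follows. This is a minimal illustration, not the authors' released configuration: the solver file name, the pretrained model file, and the iteration count are assumptions.

    # Minimal fine-tuning sketch with pycaffe (assumed file names and settings).
    import caffe

    caffe.set_mode_gpu()
    caffe.set_device(0)

    # Solver prototxt defining the ANP classification net and training schedule
    # (assumed file name; the last layer is resized to the number of ANP concepts).
    solver = caffe.SGDSolver('anp_solver.prototxt')

    # Initialize from ImageNet-pretrained weights: layers with matching names are
    # copied, while the renamed final classification layer starts from scratch.
    solver.net.copy_from('bvlc_reference_caffenet.caffemodel')  # assumed pretrained model

    # Fine-tune on the Flickr ANP images (iteration count is an assumption).
    solver.step(100000)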
