
Transfer learning for image classification with sparse prototype representations

A. Quattoni, M. Collins, and T. Darrell. IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2008), pages 1-8, June 2008.
DOI: 10.1109/CVPR.2008.4587637

Abstract

To learn a new visual category from few examples, prior knowledge from unlabeled data as well as previous related categories may be useful. We develop a new method for transfer learning which exploits available unlabeled data and an arbitrary kernel function; we form a representation based on kernel distances to a large set of unlabeled data points. To transfer knowledge from previous related problems we observe that a category might be learnable using only a small subset of reference prototypes. Related problems may share a significant number of relevant prototypes; we find such a concise representation by performing a joint loss minimization over the training sets of related problems with a shared regularization penalty that minimizes the total number of prototypes involved in the approximation. This optimization problem can be formulated as a linear program that can be solved efficiently. We conduct experiments on a news-topic prediction task where the goal is to predict whether an image belongs to a particular news topic. Our results show that when only a few examples are available for training a target topic, leveraging knowledge learnt from other topics can significantly improve performance.
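The two ingredients of the abstract can be sketched in code: (1) map each image to its vector of kernel values against a pool of unlabeled prototypes, and (2) fit classifiers for several related tasks jointly under a penalty that zeroes out entire prototypes shared across tasks. This is a minimal illustration under stated assumptions, not the paper's implementation: the RBF kernel, the prototype pool `U`, the penalty weight `lam`, and the proximal-gradient group-lasso solver are all stand-in choices; the paper itself minimizes a joint loss with an L1/L-infinity shared penalty formulated as a linear program.

```python
import numpy as np

def rbf_features(X, U, gamma=0.5):
    """Represent each row of X by its RBF kernel values against the
    unlabeled prototype pool U; classifiers are linear in this space."""
    d2 = ((X[:, None, :] - U[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def joint_sparse_fit(Phi, Y, lam=0.05, lr=0.01, iters=500):
    """Jointly fit one linear classifier per task (columns of Y) on the
    prototype representation Phi, shrinking whole prototype rows of W to
    zero via a group penalty (proximal gradient on squared loss; an
    illustrative stand-in for the paper's L1/Linf linear program)."""
    n, m = Phi.shape
    W = np.zeros((m, Y.shape[1]))
    for _ in range(iters):
        grad = Phi.T @ (Phi @ W - Y) / n      # squared-loss gradient
        W -= lr * grad
        # group soft-threshold: shrink each prototype's row across tasks
        norms = np.linalg.norm(W, axis=1, keepdims=True)
        W *= np.maximum(0.0, 1.0 - lr * lam / np.maximum(norms, 1e-12))
    return W

rng = np.random.default_rng(0)
U = rng.normal(size=(15, 2))        # pool of unlabeled prototypes
X = rng.normal(size=(60, 2))
Phi = rbf_features(X, U)
# two related binary tasks driven by the same few prototypes
w_true = np.zeros((15, 2))
w_true[:3] = rng.normal(size=(3, 2))
Y = np.sign(Phi @ w_true + 0.01 * rng.normal(size=(60, 2)))
W = joint_sparse_fit(Phi, Y)
active = np.linalg.norm(W, axis=1) > 1e-6
print("active prototypes:", active.sum(), "of", len(U))
```

Because the penalty acts on whole rows of `W`, a prototype is either used by the group of tasks or discarded entirely, which is the "concise shared representation" the abstract describes; a new target task with few labels can then be learned over only the surviving prototypes.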

Description

Training with unlabelled data.
