Misc

Deep Forest: Towards An Alternative to Deep Neural Networks.

Zhi-Hua Zhou and Ji Feng.
(2017). cite arxiv:1702.08835. Comment: 7 pages, 5 figures.

Abstract

In this paper, we propose gcForest, a decision tree ensemble approach whose performance is highly competitive with deep neural networks. In contrast to deep neural networks, which require great effort in hyper-parameter tuning, gcForest is much easier to train; even when it is applied to data from different domains, excellent performance can be achieved with almost the same hyper-parameter settings. The training process of gcForest is efficient and scalable. In our experiments, its training time on a PC is comparable to that of deep neural networks running on GPUs, and the efficiency advantage may become more pronounced because gcForest naturally lends itself to parallel implementation. Furthermore, in contrast to deep neural networks, which require large-scale training data, gcForest can work well even when only small-scale training data are available. Moreover, as a tree-based approach, gcForest should be more amenable to theoretical analysis than deep neural networks.
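
For readers who want to see the cascade idea in code: the sketch below is my own minimal illustration, not the authors' implementation. It assumes scikit-learn as a dependency and omits the paper's multi-grained scanning; each cascade level trains a few forests and passes their class-probability vectors, concatenated with the raw input features, on to the next level.

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import cross_val_predict, train_test_split

    def cascade_level(X, y, n_estimators=100, cv=3, seed=0):
        """One cascade level: a few forests whose out-of-fold class
        probabilities become augmented features for the next level."""
        forests = [
            RandomForestClassifier(n_estimators=n_estimators, random_state=seed),
            ExtraTreesClassifier(n_estimators=n_estimators, random_state=seed),
        ]
        probas = []
        for clf in forests:
            # Out-of-fold predictions keep the augmented features honest.
            probas.append(cross_val_predict(clf, X, y, cv=cv, method="predict_proba"))
            clf.fit(X, y)  # refit on all data for use at prediction time
        return forests, np.hstack(probas)

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    feats_tr, feats_te = X_tr, X_te
    for _ in range(2):  # fixed depth here; gcForest grows until validation stalls
        forests, aug_tr = cascade_level(feats_tr, y_tr)
        aug_te = np.hstack([f.predict_proba(feats_te) for f in forests])
        # The next level sees the raw features concatenated with the class vectors.
        feats_tr = np.hstack([X_tr, aug_tr])
        feats_te = np.hstack([X_te, aug_te])

    # Final prediction: average the last level's class-probability vectors.
    n_classes = len(np.unique(y_tr))
    proba = aug_te.reshape(len(X_te), -1, n_classes).mean(axis=1)
    print("cascade accuracy:", accuracy_score(y_te, proba.argmax(axis=1)))

The full gcForest additionally runs multi-grained scanning in front of this cascade and adds levels only while validation performance keeps improving, which is what keeps its hyper-parameter tuning so light.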

Users

  • @marcsaric
  • @dblp

Comments and Reviews

  • @giannis81
    7 years ago (last updated 7 years ago)
    Interesting
  • @giannis81
    @giannis81 7 years ago
    Interesting, I'd say...
  • @marcsaric
    7 years ago (last updated 7 years ago)