A Comparative Evaluation of Voting and Meta-learning on Partitioned
Data
P. Chan and S. Stolfo. International Conference on Machine Learning, pages 90-98. (1995)

The paper introduces a meta-learning scheme, denoted an arbiter, which
builds a tree that ultimately produces a prediction. This might be
interesting in terms of exchanging predictions in the classifier system.
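The arbiter scheme is only sketched in the note above. As a hedged illustration of the idea (our reading of the scheme, not the paper's implementation), an arbiter can be seen as an extra classifier trained on the examples where two base classifiers disagree, which then breaks ties at prediction time. All names below (`majority_learner`, `train_arbiter`, `arbitrate`) are illustrative, and the toy majority-class learner is a stand-in for a real inductive learner.

```python
from collections import Counter

# Hedged sketch of the arbiter idea as we read it (not the paper's
# code): an arbiter is an extra learner trained on the examples where
# two base classifiers disagree; at prediction time it resolves their
# disagreements.

def majority_learner(data):
    """Toy base learner: always predicts the most common label it saw.
    data is a list of (x, label) pairs."""
    label = Counter(y for _, y in data).most_common(1)[0][0]
    return lambda x: label

def train_arbiter(base_a, base_b, data, learner=majority_learner):
    """Train the arbiter only on examples where the two bases disagree."""
    disagreements = [(x, y) for x, y in data if base_a(x) != base_b(x)]
    return learner(disagreements or data)

def arbitrate(base_a, base_b, arbiter, x):
    """Return the agreed label, or let the arbiter break the tie."""
    pa, pb = base_a(x), base_b(x)
    return pa if pa == pb else arbiter(x)

# Two partitions with different class balances, so the bases disagree.
part_a = [("a", 0)] * 4 + [("b", 1)]      # majority label 0
part_b = [("a", 0)] * 2 + [("b", 1)] * 3  # majority label 1
base_a = majority_learner(part_a)
base_b = majority_learner(part_b)
arbiter = train_arbiter(base_a, base_b, part_a + part_b)
print(arbitrate(base_a, base_b, arbiter, "a"))
```

In the paper's full scheme, such arbiters are arranged pairwise into a tree over the partition-level classifiers; the sketch above shows only one node of that tree.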
Abstract
Much of the research in inductive learning concentrates on problems
with relatively small amounts of data. With the coming age of very
large network computing, it is likely that orders of magnitude more
data in databases will be available for various learning problems
of real world importance. Some learning algorithms assume that the
entire data set fits into main memory, which is not feasible for
massive amounts of data. One approach to handling a large data set
is to partition the data set into subsets, run the learning algorithm
on each of the subsets, and combine the results. In this paper we
evaluate different techniques for learning from partitioned data.
Our meta-learning approach is empirically compared with techniques
in the literature that aim to combine multiple evidence to arrive
at one prediction.
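The partition-then-combine approach described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' code: the 1-D threshold learner and the simple majority vote stand in for whatever base learning algorithm and combining technique are used.

```python
import random
from collections import Counter

# Sketch of the abstract's approach: partition the data set into
# subsets, run a learner on each subset, and combine the resulting
# predictions (here, by simple majority voting).

def fit_threshold(data):
    """Toy learner: find a threshold separating labels 0 and 1 on one
    subset. data is a list of (x, label) pairs."""
    pos = [x for x, y in data if y == 1]
    neg = [x for x, y in data if y == 0]
    return (min(pos) + max(neg)) / 2 if pos and neg else 0.0

def predict(threshold, x):
    return 1 if x >= threshold else 0

def vote(models, x):
    """Combine the partition-level learners by majority vote."""
    counts = Counter(predict(t, x) for t in models)
    return counts.most_common(1)[0][0]

random.seed(0)
# Synthetic data: class 0 lies below 5, class 1 above 5.
data = [(random.uniform(0, 5), 0) for _ in range(300)] + \
       [(random.uniform(5, 10), 1) for _ in range(300)]
random.shuffle(data)

# Partition into 4 subsets and train one learner per subset.
k = 4
partitions = [data[i::k] for i in range(k)]
models = [fit_threshold(p) for p in partitions]

print(vote(models, 2.0))  # class 0
print(vote(models, 8.0))  # class 1
```

The paper's contribution is the comparison of combining techniques: the `vote` step above is the voting baseline, against which the arbiter-based meta-learning scheme is evaluated.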
%0 Conference Paper
%1 chan95comparative
%A Chan, Philip K.
%A Stolfo, Salvatore J.
%B International Conference on Machine Learning
%D 1995
%K imported
%P 90-98
%T A Comparative Evaluation of Voting and Meta-learning on Partitioned
Data
%U citeseer.nj.nec.com/chan95comparative.html
%X Much of the research in inductive learning concentrates on problems
with relatively small amounts of data. With the coming age of very
large network computing, it is likely that orders of magnitude more
data in databases will be available for various learning problems
of real world importance. Some learning algorithms assume that the
entire data set fits into main memory, which is not feasible for
massive amounts of data. One approach to handling a large data set
is to partition the data set into subsets, run the learning algorithm
on each of the subsets, and combine the results. In this paper we
evaluate different techniques for learning from partitioned data.
Our meta-learning approach is empirically compared with techniques
in the literature that aim to combine multiple evidence to arrive
at one prediction.
@inproceedings{chan95comparative,
abstract = {Much of the research in inductive learning concentrates on problems
with relatively small amounts of data. With the coming age of very
large network computing, it is likely that orders of magnitude more
data in databases will be available for various learning problems
of real world importance. Some learning algorithms assume that the
entire data set fits into main memory, which is not feasible for
massive amounts of data. One approach to handling a large data set
is to partition the data set into subsets, run the learning algorithm
on each of the subsets, and combine the results. In this paper we
evaluate different techniques for learning from partitioned data.
Our meta-learning approach is empirically compared with techniques
in the literature that aim to combine multiple evidence to arrive
at one prediction.},
added-at = {2008-04-30T12:59:47.000+0200},
author = {Chan, Philip K. and Stolfo, Salvatore J.},
biburl = {https://www.bibsonomy.org/bibtex/211abbc2f186d8e74f70aa7a09992c993/kdubiq},
booktitle = {International Conference on Machine Learning},
description = {KDubiq Blueprint},
groupsearch = {0},
interhash = {19202e27c7e9336380adf2b803b2d42f},
intrahash = {11abbc2f186d8e74f70aa7a09992c993},
keywords = {imported},
note = {The paper introduces a meta-learning scheme, denoted an arbiter,
which builds a tree that ultimately produces a prediction. This might
be interesting in terms of exchanging predictions in the classifier
system.},
pages = {90--98},
timestamp = {2008-04-30T12:59:58.000+0200},
title = {A Comparative Evaluation of Voting and Meta-learning on Partitioned
Data},
url = {citeseer.nj.nec.com/chan95comparative.html},
year = 1995
}