Abstract
This paper is concerned with the automatic induction
of parsimonious neural networks. In contrast to other
program induction situations, network induction entails
parametric learning as well as structural adaptation.
We present a novel representation scheme called neural
trees that allows efficient learning of both network
architectures and parameters by genetic search. A
hybrid evolutionary method is developed for neural tree
induction that combines genetic programming and the
breeder genetic algorithm under the unified framework
of the minimum description length principle. The method
is successfully applied to the induction of higher
order neural trees while still keeping the resulting
structures sparse to ensure good generalization
performance. Empirical results are provided on two
chaotic time series prediction problems of practical
interest.
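The abstract's core idea, scoring candidate neural trees by a minimum description length criterion that trades fit against structural complexity, can be sketched as follows. This is a minimal illustration, not the paper's actual fitness function: the function name, the Gaussian-residual coding cost, and the `(k/2) log n` complexity penalty are assumptions chosen to show the general MDL form.

```python
import math

def mdl_fitness(squared_errors, n_params, n_samples, alpha=1.0):
    """Hypothetical MDL-style fitness for an evolved neural tree:
    data-coding cost (fit error) plus model-coding cost (complexity).
    Lower values are better, so sparser trees win when errors are equal."""
    # Data term: cost of encoding residuals, proportional to the negative
    # log-likelihood under an assumed Gaussian noise model.
    error_cost = 0.5 * n_samples * math.log(sum(squared_errors) / n_samples)
    # Model term: cost of encoding the tree's weights and structure,
    # growing with parameter count (a classic (k/2) log n penalty);
    # alpha is an assumed tuning knob weighting parsimony against fit.
    model_cost = alpha * 0.5 * n_params * math.log(n_samples)
    return error_cost + model_cost
```

Under such a criterion, a larger tree must buy a genuine reduction in prediction error to offset its higher coding cost, which is what keeps the evolved structures sparse.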