Abstract
Genetic programming has been successfully applied to
evolve computer programs for solving a variety of
interesting problems. In previous work we introduced
the breeder genetic programming (BGP) method
that has Occam's razor in its fitness measure to evolve
minimal-size multilayer perceptrons. In this paper we
apply the method to the synthesis of sigma-pi neural
networks. Unlike perceptron architectures, sigma-pi
networks use product units as well as summation units
to build higher-order terms. The effectiveness of the
method is demonstrated on benchmark problems.
Simulation results on noisy data suggest that BGP not
only improves generalization performance but can also
accelerate convergence.
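
As a rough illustration (not the paper's implementation; the function name, term encoding, and parameter values below are assumptions), a sigma-pi unit can be sketched in Python as a summation unit over product terms, where each product unit multiplies a subset of the inputs to form a higher-order term:

import numpy as np

def sigma_pi_unit(x, terms, weights, bias=0.0):
    # x: input vector
    # terms: one index tuple per product unit; each tuple names the
    #        inputs multiplied together to form a higher-order term
    # weights: one weight per product term
    s = bias
    for w, idx in zip(weights, terms):
        s += w * np.prod(x[list(idx)])   # product unit -> higher-order term
    return 1.0 / (1.0 + np.exp(-s))      # summation unit with sigmoid output

# Example: a second-order unit over inputs x1, x2 with the
# terms x1, x2, and the product x1*x2.
x = np.array([0.5, -1.0])
terms = [(0,), (1,), (0, 1)]
weights = [0.8, -0.3, 1.2]
print(sigma_pi_unit(x, terms, weights))

In this sketch, the tuple (0, 1) yields the second-order term x1*x2; in the BGP setting described above, the choice of such terms and the network topology would be evolved, with the Occam's-razor fitness measure penalizing larger networks.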