Abstract
This paper presents a new approach to the evolution of neural networks. A linear chromosome, combined with a grid-based representation of the network and a new crossover operator, allows the architecture and the weights to be evolved simultaneously. Our approach requires no separate weight-optimization procedure and can evolve networks containing more than one type of activation function. A pruning strategy is also introduced, which yields solutions of varying complexity. Results of applying the method to several binary classification problems are reported.
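The scheme described above can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' implementation: the grid layout, gene structure, and one-point crossover are assumptions made for the sake of the example. Each gene encodes one grid node's activation function together with its incoming weights, so a single linear chromosome carries both architecture (which activation each node uses) and weights, and crossover recombines both at once.

```python
import math
import random

# Two activation types, so evolved networks can mix them (as the
# abstract describes); the set of functions here is an assumption.
ACTIVATIONS = [math.tanh, lambda x: 1.0 / (1.0 + math.exp(-x))]

def random_chromosome(n_inputs, rows, cols, rng):
    """Linear chromosome: one gene per grid node, laid out row-major.

    Each gene = (activation index, weights from the previous column).
    """
    genome = []
    prev = n_inputs
    for _ in range(cols):          # columns of the grid act as layers
        for _ in range(rows):
            genome.append((rng.randrange(len(ACTIVATIONS)),
                           [rng.uniform(-1.0, 1.0) for _ in range(prev)]))
        prev = rows
    return genome

def forward(genome, inputs, rows, cols):
    """Decode the linear chromosome onto the grid and evaluate it."""
    acts = list(inputs)
    for c in range(cols):
        layer = []
        for r in range(rows):
            act_idx, weights = genome[c * rows + r]
            s = sum(w * a for w, a in zip(weights, acts))
            layer.append(ACTIVATIONS[act_idx](s))
        acts = layer
    return acts

def one_point_crossover(a, b, rng):
    """Crossover on the linear chromosome: because gene positions map to
    grid positions, offspring inherit contiguous sub-networks along with
    both their structure (activations) and their weights."""
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]
```

Because architecture and weights live in the same genome, a single evolutionary loop over chromosomes like these needs no inner weight-training phase, which is the property the abstract emphasizes.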