Article

Implementing Linear Models in Genetic Programming

IEEE Transactions on Evolutionary Computation, 8 (6): 542--566 (December 2004)
DOI: 10.1109/TEVC.2004.836818

Abstract

We deal with linear models of genetic programming (GP) for regression or approximation problems when the given learning samples are not sufficient. The linear model, which is a function of unknown parameters, is built by extracting all possible base functions from the standard GP tree with a symbolic processing algorithm. The major advantage of a linear model in GP is that its parameters can be estimated by the ordinary least squares (OLS) method and a good model can be selected by applying the modern minimum description length (MDL) principle, while the nonlinearity needed to handle the given problem is effectively maintained by indirectly evolving and finding various forms of base functions. In addition to a standard linear model consisting of mathematical functions, this paper also considers a variant of the linear model that can be built from low-order Taylor series and converted into the standard form of a polynomial. With small samples, GP frequently shows abnormal behaviors such as extremely large peaks or odd-looking discontinuities at points away from the sample points. To overcome this problem, a directional derivative-based smoothing (DDBS) method, incorporated into the OLS method, is introduced together with an MDL-based fitness function that reflects the effects of DDBS. Two illustrative examples and three engineering applications are also presented.
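To make the core idea concrete, the following is a minimal sketch, not the authors' implementation: a linear-in-parameters model y ≈ a0 + Σ a_k f_k(x), whose coefficients are estimated by OLS and scored with a generic MDL-style criterion. The base functions here are fixed toy stand-ins for the subtrees that would be extracted from a GP individual, and the paper's DDBS smoothing term is omitted from the score.

```python
import numpy as np

def fit_linear_model(base_funcs, X, y):
    """Fit coefficients by ordinary least squares and return a generic
    MDL-style score (assumption: the paper's actual fitness also reflects
    the DDBS smoothing term, which is not modeled here)."""
    N = len(y)
    # Design matrix: constant term plus one column per base function.
    F = np.column_stack([np.ones(N)] + [f(X) for f in base_funcs])
    coeffs, *_ = np.linalg.lstsq(F, y, rcond=None)
    rss = float(np.sum((y - F @ coeffs) ** 2))
    k = F.shape[1]                      # number of free parameters
    mdl = 0.5 * N * np.log(rss / N) + 0.5 * k * np.log(N)
    return coeffs, mdl

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=50)
    y = 1.0 + 2.0 * np.sin(X) - 0.5 * X**2 + rng.normal(0.0, 0.05, size=50)

    # Hypothetical base functions, standing in for subtrees of a GP tree.
    base_funcs = [np.sin, lambda x: x**2, np.cos]
    coeffs, mdl = fit_linear_model(base_funcs, X, y)
    print("coefficients:", coeffs, "MDL score:", mdl)
```

In this reading, GP evolves the set of base functions (the nonlinear part), while OLS and the MDL criterion handle parameter estimation and model selection for the linear combination.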

Users

  • @brazovayeye
