Abstract
In this paper we present a classifier design approach that yields classifiers
with high computational efficiency and low memory requirements. The method
subdivides the pattern space into regions containing the pattern classes,
using a minimum number of optimized linear inequalities, the discriminators.
The discrimination vectors are selected from the principal axes of each
pattern class's covariance matrix. We present an optimization procedure for
sets of discrete parameter bounds, briefly discuss the problem of multimodal
pattern classes, and consider the possibility of using suboptimal solutions.
The main objective is to design a group of optimum classifiers for payphone
coin classification. In two examples we compare the performance of our
classifier with the classification performance obtained by fitting normal
probability densities to the pattern classes.
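The core idea sketched in the abstract can be illustrated as follows. This is not the authors' exact procedure, only a minimal sketch under assumed details: discrimination vectors are taken as the principal axes of a class covariance matrix, a region is bounded by interval inequalities along those axes (bounds chosen here as mean ± k·std, a hypothetical choice), and a pattern belongs to the region if it satisfies every inequality.

```python
import numpy as np

def principal_axes(samples):
    """Eigenvectors of the sample covariance matrix, largest variance first."""
    cov = np.cov(samples, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # reorder: largest first
    return eigvecs[:, order]

def fit_region(samples, axes, k=3.0):
    """Interval bounds (mean ± k*std) along each discrimination vector.
    The choice of k is a hypothetical stand-in for the paper's optimization."""
    proj = samples @ axes
    mean, std = proj.mean(axis=0), proj.std(axis=0)
    return mean - k * std, mean + k * std       # lower/upper bound per axis

def in_region(x, axes, lo, hi):
    """True if x satisfies every linear inequality defining the region."""
    p = x @ axes
    return bool(np.all((p >= lo) & (p <= hi)))

# Synthetic two-feature "coin measurement" data for one class.
rng = np.random.default_rng(0)
class_a = rng.normal([1.0, 2.0], [0.1, 0.3], size=(200, 2))
axes_a = principal_axes(class_a)
lo, hi = fit_region(class_a, axes_a)
print(in_region(np.array([1.0, 2.0]), axes_a, lo, hi))  # near class center
```

A classifier built this way needs only a few dot products and comparisons per pattern, which is the source of the computational efficiency and low memory footprint claimed above.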