Abstract
Slow feature analysis (SFA) is a new unsupervised algorithm to learn nonlinear functions that extract slowly varying signals from the input data. In this paper we describe its application to pattern
recognition. In this context, in order to be slowly varying, the functions learned by SFA need to respond
similarly to the patterns belonging to the same class. We prove that, given input patterns belonging to C
non-overlapping classes and a large enough function space, the optimal solution consists of C − 1 output
signals that are constant for each individual class. As a consequence, their output provides a feature
space suitable for classification with simple methods, such as Gaussian classifiers. We then show,
as an example, the application of SFA to the MNIST handwritten digits database. The performance of
SFA is comparable to that of other established algorithms. Finally, we suggest some possible extensions
to the proposed method. Our approach is particularly attractive because, for a given input signal and
a fixed function space, it has no parameters; it is easy to implement and apply, and it has low memory
requirements and high speed during recognition. SFA finds the global solution (within the considered
function space) in a single iteration without convergence issues. Moreover, the proposed method is
completely problem-independent.
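The abstract's core computation can be illustrated with a minimal sketch of linear SFA: whiten the input, then find the directions along which the temporal derivative has least variance (the "slowest" features). This is an illustrative implementation, not the authors' code; the function name `sfa` and all variable names are our own. For classification, patterns of the same class would be presented consecutively so that slowness translates into within-class constancy.

```python
import numpy as np

def sfa(x):
    """Linear slow feature analysis on a signal x of shape (T, n).

    Returns (w, deltas): a projection matrix w for centered data and the
    delta values (temporal-derivative variances), slowest feature first.
    """
    x = x - x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    # Whiten the input (sphering): project onto eigenvectors, rescale to unit variance.
    d, u = np.linalg.eigh(cov)
    keep = d > 1e-10                      # drop near-singular directions
    w_white = u[:, keep] / np.sqrt(d[keep])
    z = x @ w_white
    # Minimize the variance of the temporal derivative in the whitened space:
    # the eigenvectors of the derivative covariance, in ascending order,
    # give the slowest (most class-constant) output signals first.
    dz = np.diff(z, axis=0)
    dd, du = np.linalg.eigh(np.cov(dz, rowvar=False))
    return w_white @ du, dd
```

On toy data with two well-separated classes presented block-wise, the slowest output signal is approximately constant within each block and differs between the blocks, which is the feature-space property the abstract exploits for classification.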