Technical Report

Synthesis of Neural Networks: the Case of Cascaded Hebbians

Pp. 96–102. Research Group on Artificial Intelligence, JATE-MTA, Szeged 6720, Aradi vrt tere 1., Hungary (August 1996)

Abstract

We show that cascading Hebbian learning with any other convergent algorithm (called the forward algorithm) makes the Hebbian weights converge to the stationary point at which the Hebbian algorithm would converge if the weights of the forward algorithm had already converged. Further, we show that the convergence rate of the composite algorithm does not deteriorate because of the cascading. This result is a consequence of a more general theorem, which is also stated and proved here; the proofs rest on a global Lipschitz assumption. The theory is illustrated by a composite PCA-Hebbian architecture introduced by Michaels (Michaels, 1995).
