Posted on 2010-01-15, 10:02. Authored by Jonathon Chambers, Warren Sherliker, Danilo P. Mandic
A normalized algorithm for the on-line adaptation of a recurrent perceptron is derived. The algorithm builds upon the normalized backpropagation (NBP) algorithm for feedforward neural networks, and provides an adaptive, normalized learning rate for a recurrent perceptron learning algorithm. The derivation rests upon a local linearization about the current point in the state space of the network. The learning rate is normalized by the squared norm of the gradient at the neuron, which extends the notion of normalized linear algorithms to the nonlinear case.
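
To illustrate the idea, a minimal sketch of such an update is given below. It assumes a tanh nonlinearity, a single neuron fed by a bias, the current input and its own past outputs, and the hypothetical function name normalized_recurrent_perceptron; none of these details are taken from the paper. The step size is divided by the squared norm of the instantaneous gradient at the neuron, with the dependence of the fed-back outputs on the weights neglected under the local linearization.

```python
import numpy as np

def normalized_recurrent_perceptron(x, d, n_feedback=2, mu=0.5, eps=1e-6):
    # Single recurrent perceptron: one neuron driven by a bias, the current
    # input and its own past outputs.  On-line gradient training in which the
    # step size is normalized by the squared norm of the instantaneous gradient
    # (a sketch of the normalized update described in the abstract, not the
    # authors' exact algorithm).
    N = len(x)
    w = np.zeros(2 + n_feedback)          # [bias, input, feedback taps]
    y_hist = np.zeros(n_feedback)         # fed-back past outputs
    y_out = np.zeros(N)

    phi = np.tanh
    dphi = lambda v: 1.0 - np.tanh(v) ** 2

    for k in range(N):
        u = np.concatenate(([1.0, x[k]], y_hist))   # regressor at time k
        v = w @ u
        y = phi(v)
        e = d[k] - y                                 # instantaneous error

        # Gradient at the neuron under a local linearization about the
        # current operating point (feedback dependence on w neglected).
        grad = dphi(v) * u

        # Adaptive, normalized learning rate: fixed step mu divided by the
        # squared gradient norm; eps guards against division by zero.
        eta = mu / (eps + grad @ grad)
        w = w + eta * e * grad

        y_hist = np.concatenate(([y], y_hist[:-1]))
        y_out[k] = y

    return w, y_out

# Toy usage: one-step-ahead prediction of a noisy sinusoid.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    s = np.sin(0.05 * np.arange(1000)) + 0.05 * rng.standard_normal(1000)
    w, y = normalized_recurrent_perceptron(s[:-1], s[1:])
    print("final weights:", w)
```

The normalization plays the same role as in the normalized LMS filter: when the gradient is large the effective step shrinks, which keeps the on-line update well behaved without hand-tuning the learning rate.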
History
School
Mechanical, Electrical and Manufacturing Engineering
Citation
CHAMBERS, J.A., SHERLIKER, W. and MANDIC, D.P., 2000. A normalized gradient algorithm for an adaptive recurrent perceptron. IN: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '00), Istanbul, 5-9 June, Vol. 1, pp. 396-399.