posted on 2010-01-15, 09:50, authored by Danilo P. Mandic, Jonathon Chambers, Milorad M. Bozic
Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. The existence, convergence, and robustness of such a process are analyzed based upon the contraction mapping theorem (CMT) and the corresponding fixed point iteration (FPI). Upper bounds for such a process are shown to be the conditions of convergence for a commonly analyzed RNN with a linear state dependence.
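The abstract's core tool, fixed point iteration under the contraction mapping theorem, can be illustrated with a minimal sketch. The snippet below is not the authors' construction; it assumes a single-neuron relaxation y ← tanh(w·y + b) with |w| < 1, so the map is a contraction (tanh has unit Lipschitz constant, giving contraction constant |w|) and the CMT guarantees convergence to a unique fixed point from any initial state, which is the flavor of GAS result the paper establishes for fully connected RNNs.

```python
import math

def fixed_point_iteration(f, y0, tol=1e-12, max_iter=10_000):
    """Iterate y <- f(y) until successive values differ by less than tol."""
    y = y0
    for k in range(max_iter):
        y_next = f(y)
        if abs(y_next - y) < tol:
            return y_next, k + 1
        y = y_next
    return y, max_iter

# Hypothetical single-neuron relaxation: weight w and bias b are
# illustrative values, with |w| < 1 ensuring a contraction mapping.
w, b = 0.8, 0.5
relax = lambda y: math.tanh(w * y + b)

# CMT: the fixed point is unique, so any starting state converges to it.
y_star_a, _ = fixed_point_iteration(relax, 0.0)
y_star_b, _ = fixed_point_iteration(relax, -5.0)
```

Starting the iteration from two very different initial states and arriving at the same fixed point is the global part of global asymptotic stability; for |w| ≥ 1 the contraction argument, and hence this guarantee, no longer applies.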
History
School
Mechanical, Electrical and Manufacturing Engineering
Citation
MANDIC, D.P., CHAMBERS, J.A. and BOZIC, M.M., 2000. On global asymptotic stability of fully connected recurrent neural networks. IN: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '00), Istanbul, 5-9 June, Vol. 6, pp. 3406-3409.