Posted on 2010-01-15, 11:04. Authored by Danilo P. Mandic, Jonathon Chambers.
We address the choice of the coefficients in the cost function of a modular nested recurrent neural network (RNN) architecture, known as the pipelined recurrent neural network (PRNN). Such a network can cope with the vanishing gradient problem encountered in prediction with RNNs. Constraints on the coefficients of the cost function, in the form of a vector norm, are considered. Unlike the previous cost function for the PRNN, which included a forgetting factor motivated by the recursive least squares (RLS) strategy, the proposed forms of the cost function provide "forgetting" of the outputs of adjacent modules based upon the network architecture. Such an approach takes the number of modules in the PRNN into account through a unit norm constraint on the coefficients of the cost function. This is shown to be particularly suitable: owing to the inherent nesting in the PRNN, every module contributes fully to the learning process, while the unit norm constrained cost function introduces a sense of forgetting into the memory management of the PRNN. A PRNN based upon the modified cost function outperforms existing PRNN schemes in the time series prediction simulations presented.
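
The abstract describes the cost function in words only; the following is a minimal sketch of the idea, not the paper's exact formulation. It assumes the standard PRNN cost, in which the squared error of module i is weighted by lambda^(i-1) for a forgetting factor lambda, and contrasts it with a variant whose weights are rescaled to unit L1 norm so that the weighting adapts to the number of modules M. The function name prnn_cost, its parameters, and the specific choice of the L1 norm are illustrative assumptions.

    import numpy as np

    def prnn_cost(errors, lam=0.9, normalize=True):
        # Weighted cost over the M module errors of a PRNN.
        # errors    : instantaneous errors e_i(n), one per module
        #             (module 1 first).
        # lam       : forgetting factor in (0, 1].
        # normalize : if True, rescale the exponential weights
        #             lam**(i-1) to unit L1 norm, so the weighting
        #             accounts for the number of modules M (one
        #             reading of the unit norm constraint above);
        #             if False, use the raw RLS-style weights.
        errors = np.asarray(errors, dtype=float)
        M = errors.size
        w = lam ** np.arange(M)      # lam**0, lam**1, ..., lam**(M-1)
        if normalize:
            w = w / np.sum(w)        # unit L1 norm: weights sum to one
        return np.sum(w * errors ** 2)

    # Example: five module errors at time step n
    e = [0.20, 0.10, 0.05, 0.30, 0.15]
    print(prnn_cost(e, lam=0.9))                     # constrained cost
    print(prnn_cost(e, lam=0.9, normalize=False))    # RLS-style cost

With normalize=True every module still contributes to the cost, but the relative weighting now depends on M, which reflects the architecture-based sense of forgetting described above.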
History
School
Mechanical, Electrical and Manufacturing Engineering
Citation
MANDIC, D.P. and CHAMBERS, J.A., 2000. On the choice of parameters of the cost function in nested modular RNN's. IEEE Transactions on Neural Networks, 11(2), pp. 315-322.