Toward an optimal PRNN-based nonlinear predictor
Journal contribution posted on 2010-01-15, 11:39, authored by Danilo P. Mandic, Jonathon Chambers
We present an approach for selecting optimal parameters for the pipelined recurrent neural network (PRNN) in the paradigm of nonlinear and nonstationary signal prediction. We consider the role of nesting, which is inherent to the PRNN architecture. The number of nested modules needed for a given prediction task, and their contributions toward the final prediction gain, give a thorough insight into how the PRNN performs and offer solutions for the optimization of its parameters. In particular, nesting allows the forgetting factor in the cost function of the PRNN to exceed unity, so that it becomes an emphasis factor. This compensates for the small contribution of the distant modules to the prediction process, which is due to nesting, and helps to circumvent the vanishing-gradient problem experienced in RNNs used for prediction. The PRNN is shown to outperform the linear least mean square (LMS) and recursive least squares (RLS) predictors, as well as previously proposed PRNN schemes, at no additional computational cost.
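The role of the forgetting/emphasis factor described in the abstract can be illustrated with a minimal sketch. The weighted-sum form of the PRNN cost, E(n) = Σ_{i=1}^{M} λ^{i-1} e_i²(n) over the M nested modules, is the standard PRNN formulation; the function name and the example error values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def prnn_cost(errors, lam):
    """Weighted PRNN cost for module errors e_1..e_M and factor lam.

    Computes E(n) = sum_{i=1}^{M} lam**(i-1) * e_i(n)**2, where lam is the
    forgetting factor. With lam > 1 it acts as an emphasis factor, boosting
    the contribution of the distant (large-i) modules, which nesting would
    otherwise leave with little influence on the final prediction.
    """
    errors = np.asarray(errors, dtype=float)
    weights = lam ** np.arange(len(errors))  # lam**(i-1) for i = 1..M
    return float(np.sum(weights * errors**2))

# Example (hypothetical values): identical errors across four modules.
errors = [0.5, 0.5, 0.5, 0.5]
cost_forget = prnn_cost(errors, 0.9)  # conventional forgetting factor < 1
cost_emph = prnn_cost(errors, 1.2)    # emphasis factor > 1, as proposed
```

With λ < 1 the distant modules are progressively attenuated; choosing λ > 1 reverses that weighting, which is the compensation for nesting that the abstract describes.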
- Mechanical, Electrical and Manufacturing Engineering
Citation: MANDIC, D.P. and CHAMBERS, J.A., 1999. Toward an optimal PRNN-based nonlinear predictor. IEEE Transactions on Neural Networks, 10(6), pp. 1435-1442.
- VoR (Version of Record)
Notes: This article was published in the journal IEEE Transactions on Neural Networks [© IEEE] and is also available at: http://ieeexplore.ieee.org/. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.