AI algorithms for fitting GARCH parameters to empirical financial data
We use deep Artificial Neural Networks (ANNs) to estimate GARCH parameters for empirical financial time series. The algorithm we develop allows us to fit the autocovariance of squared returns of financial data at certain time lags, the second-order statistical moment, and the fourth-order standardised moment. We compared the time taken by the ANN algorithm to predict parameters for many time windows (around 4000) with that taken by the Maximum Likelihood Estimation (MLE) methods of MATLAB's built-in statistical and econometrics toolbox. The algorithm developed predicts all GARCH parameters in around 0.1 s, compared with 11 s for the MLE method. Furthermore, we use a Model Confidence Set analysis to determine how accurately our parameter-prediction algorithm predicts volatility. The volatility prediction of different securities obtained with the ANN has an error of around 25%, compared with 40% for the MLE methods.
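The target statistics mentioned in the abstract can be illustrated with a minimal Python sketch: it simulates a GARCH(1,1) series and computes the second moment, the standardised fourth moment, and the autocovariance of squared returns at a few lags. The function names, parameter values, and the choice of Gaussian innovations are our own assumptions for illustration; the paper's actual ANN fitting pipeline is not shown here.

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Simulate a GARCH(1,1) return series:
    r_t = sigma_t * z_t,  sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
    with z_t i.i.d. standard normal (an assumption for this sketch)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    r = np.empty(n)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * z[t]
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

def target_statistics(r, lags=(1, 2, 3)):
    """Statistics of the kind the ANN is trained to match:
    second moment, standardised fourth moment (kurtosis), and the
    autocovariance of squared returns at the given time lags."""
    r = np.asarray(r, dtype=float)
    m2 = np.mean(r ** 2)                      # second-order moment
    m4_std = np.mean(r ** 4) / m2 ** 2        # standardised fourth moment
    sq = r ** 2 - np.mean(r ** 2)             # centred squared returns
    acov = [np.mean(sq[:-k] * sq[k:]) for k in lags]
    return m2, m4_std, acov

r = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=20000)
m2, m4_std, acov = target_statistics(r)
```

With these (illustrative) parameter values the unconditional variance is omega / (1 - alpha - beta) = 1, and the standardised fourth moment exceeds 3, reflecting the fat tails that make GARCH models useful for financial returns.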
School
- Science
Department
- Physics
Published in
- Physica A: Statistical Mechanics and its Applications
Volume
- 603
Issue
- 2022
Publisher
- Elsevier
Version
- VoR (Version of Record)
Rights holder
- © The Author(s)
Publisher statement
- This is an Open Access Article. It is published by Elsevier under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Acceptance date
- 2022-06-27
Publication date
- 2022-06-28
Copyright date
- 2022
ISSN
- 0378-4371
Publisher version
Language
- en