Investigation of computational intelligence methods in forecasting at financial markets
DOI:
https://doi.org/10.20535/SRIT.2308-8893.2023.3.04
Keywords:
optimization, GMDH, hybrid GMDH-neo-fuzzy network, LSTM, short- and middle-term forecasting
Abstract
The work considers computational intelligence methods for solving the problem of short- and middle-term forecasting in the financial sphere. LSTM deep learning networks, GMDH, and hybrid GMDH-neo-fuzzy networks were studied. Neo-fuzzy neurons were chosen as the nodes of the hybrid network, which makes it possible to reduce computational costs. The optimal network parameters were found, and the synthesis of the optimal structure of the hybrid networks was performed. Experimental studies of LSTM, GMDH, and hybrid GMDH-neo-fuzzy networks with optimal parameters were carried out for short- and middle-term forecasting, and the accuracy of the obtained forecasts was compared. The forecasting intervals for which the application of the investigated computational intelligence methods is most expedient were determined.
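To illustrate the kind of node used in the hybrid network, the sketch below shows a neo-fuzzy neuron whose output is a sum of nonlinear synapses f_i(x_i) = sum_j w_ij * mu_ij(x_i) with evenly spaced triangular membership functions, trained by a simple gradient step. This is a minimal sketch under stated assumptions; the class, function, and parameter names are hypothetical and do not reproduce the authors' implementation.

import numpy as np

# Illustrative neo-fuzzy neuron: each input x_i passes through a nonlinear synapse
# f_i(x_i) = sum_j w_ij * mu_ij(x_i), where mu_ij are triangular membership
# functions forming a partition of unity on [0, 1].

def triangular_memberships(x, centers):
    # membership degrees of scalar x (assumed scaled to [0, 1]) for evenly spaced triangular MFs
    step = centers[1] - centers[0]
    return np.maximum(0.0, 1.0 - np.abs(x - centers) / step)

class NeoFuzzyNeuron:
    def __init__(self, n_inputs, n_mf=5, lr=0.05):
        self.centers = np.linspace(0.0, 1.0, n_mf)
        self.w = np.zeros((n_inputs, n_mf))   # synaptic weights w_ij
        self.lr = lr

    def forward(self, x):
        # y = sum_i sum_j w_ij * mu_ij(x_i)
        self._mu = np.array([triangular_memberships(xi, self.centers) for xi in x])
        return float(np.sum(self.w * self._mu))

    def update(self, x, target):
        # one gradient step on the squared forecast error; dy/dw_ij = mu_ij(x_i)
        y = self.forward(x)
        self.w += self.lr * (target - y) * self._mu
        return y

# Usage: one-step-ahead forecasting from three lagged values of a series scaled to [0, 1].
neuron = NeoFuzzyNeuron(n_inputs=3)
series = np.linspace(0.1, 0.9, 50)   # toy data standing in for a scaled price series
for t in range(3, len(series)):
    neuron.update(series[t-3:t], series[t])

Because each nonlinear synapse is linear in its weights, such nodes can also be tuned by least squares, which is one reason they keep the computational cost of the hybrid network low.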
References
Peter J. Brockwell and Richard A. Davis, Introduction to Time Series and Forecasting; 2nd ed. Springer, 2002, 429 p.
Robert H. Shumway and David S. Stoffer, Time Series Analysis and Its Applications with R Examples; 4th ed. Springer, 2017, 562 p.
A.G. Ivakhnenko, G.A. Ivakhnenko, and J.A. Mueller, “Self-organization of the neural networks with active neurons,” Pattern Recognition and Image Analysis, vol. 4, no. 2, pp. 177–188, 1994.
A.G. Ivakhnenko, D. Wuensch, and G.A. Ivakhnenko, “Inductive sorting-out GMDH algorithms with polynomial complexity for active neurons of neural networks,” Neural Networks, 2, pp. 1169–1173, 1999.
S.S. Haykin, Neural networks: a comprehensive foundation; 2nd ed. Upper Saddle River, N.J: Prentice Hall, 1999.
S. Ossovsky, Neural Networks for Information Processing. Moscow: Finance and Statistics, 2002, 344 p.
F. Wang, “Neural Networks Genetic Algorithms and Fuzzy Logic for Forecasting,” Proc. Intern. Conf. Advanced Trading Technologies. New York, 1992, pp. 504–532.
T. Yamakawa, E. Uchino, T. Miki, and H. Kusanagi, “A neo-fuzzy neuron and its applications to system identification and prediction of the system behavior,” Proc. 2nd Intern. Conf. Fuzzy Logic and Neural Networks "IIZUKA-92", Iizuka, 1992, pp. 477–483.
I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016. Available: http://www.deeplearningbook.org
Yuriy Zaychenko, Yevgeniy Bodyanskiy, Oleksii Tyshchenko, Olena Boiko, and Galib Hamidov, “Hybrid GMDH-neuro-fuzzy system and its training scheme,” Int. Journal Information Theories and Applications, vol. 24, no. 2, pp. 156–172, 2018.
Yu. Zaychenko and Galib Hamidov, “The Hybrid Deep Learning GMDH-neo-fuzzy Neural Network and Its Applications,” Proceedings of the 13th IEEE International Conference on Application of Information and Communication Technologies (AICT 2019), 23–25 October 2019, Baku, pp. 72–77.
Evgeniy Bodyanskiy, Yuriy Zaychenko, Olena Boiko, Galib Hamidov, and Anna Zelikman, “Structure Optimization and Investigations of Hybrid GMDH-Neo-fuzzy Neural Networks in Forecasting Problems,” in System Analysis & Intelligent Computing; eds. Michael Zgurovsky and Natalia Pankratova (Studies in Computational Intelligence, SCI, vol. 1022). Springer, 2022, pp. 209–228.
Yuriy Zaychenko, Helen Zaichenko, and Galib Hamidov, “Hybrid GMDH Deep Learning Networks – Analysis, Optimization and Applications in Forecasting at Financial Sphere,” System Research and Information Technologies, no. 1, pp. 73–86, 2022. doi: 10.20535/SRIT.2308-8893.2022.1.06.
S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, pp. 1735–1780, 1997. doi: 10.1162/neco.1997.9.8.1735.
B. Hammer, “On the approximation capability of recurrent neural networks,” Neurocomputing, vol. 31, pp. 107–123, 1998. doi: 10.1016/S0925-2312(99)00174-5.
C. Olah, Understanding LSTM networks, 2020. Available: https://colah.github.io/posts/2015-08-Understanding-LSTMs/
A. Graves, “Generating sequences with recurrent neural networks,” CoRR, vol. abs/1308.0850, 2013. doi: 10.48550/arXiv.1308.0850.