Investigation of computational intelligence methods in forecasting problems at stock exchanges

Authors

  • Yuriy Zaychenko Educational and Scientific Complex “Institute for Applied System Analysis” of the National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine http://orcid.org/0000-0001-9662-3269
  • Galib Hamidov “Azershig” company, Baku, Azerbaijan
  • Aydin Gasanov National Pedagogical Dragomanov University, Kyiv, Ukraine http://orcid.org/0000-0002-5821-0751

DOI:

https://doi.org/10.20535/SRIT.2308-8893.2021.2.03

Keywords:

share prices forecasting, LSTM, GRU, RNN, GMDH

Abstract

In this paper, the problem of forecasting share prices at the New York Stock Exchange (NYSE) is considered and investigated. To solve it, alternative computational intelligence methods were suggested and investigated: long short-term memory (LSTM) networks, gated recurrent units (GRU), simple recurrent neural networks (RNN), and the Group Method of Data Handling (GMDH). Experimental investigations of these intelligent methods were carried out on the problem of forecasting CISCO share prices, and the forecasting accuracy of the methods was estimated and compared. It was established that GMDH attained the best forecasting accuracy among the compared methods in the share price forecasting problem.
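The GMDH approach compared in the abstract can be illustrated with a minimal, self-contained sketch of one GMDH selection layer: candidate two-input partial models are fitted on a training split and ranked by an external (validation) criterion. The toy sine series, the linear partial description, and all names below are illustrative assumptions, not the paper's actual data or configuration.

```python
import math
import random

def solve3(A, b):
    # Solve a 3x3 linear system A x = b by Gaussian elimination with partial pivoting.
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_pair(xi, xj, y):
    # Least-squares fit of the partial description y = a + b*xi + c*xj
    # via the normal equations (Gram matrix of the three regressors).
    n = len(y)
    cols = [[1.0] * n, xi, xj]
    A = [[sum(cols[p][k] * cols[q][k] for k in range(n)) for q in range(3)]
         for p in range(3)]
    rhs = [sum(cols[p][k] * y[k] for k in range(n)) for p in range(3)]
    return solve3(A, rhs)

def mse(coef, xi, xj, y):
    a, b, c = coef
    return sum((a + b * xi[k] + c * xj[k] - y[k]) ** 2
               for k in range(len(y))) / len(y)

# Toy price-like series: noisy sine; forecast the next value from 4 lags.
random.seed(0)
series = [math.sin(0.3 * t) + 0.05 * random.gauss(0, 1) for t in range(200)]
lags = 4
X = [[series[t - l] for l in range(1, lags + 1)] for t in range(lags, len(series))]
y = [series[t] for t in range(lags, len(series))]

split = int(0.7 * len(y))  # fit on the first 70%, select on the rest
best = None
for i in range(lags):
    for j in range(i + 1, lags):
        xi = [row[i] for row in X]
        xj = [row[j] for row in X]
        coef = fit_pair(xi[:split], xj[:split], y[:split])
        # External criterion: error on data NOT used for fitting.
        err = mse(coef, xi[split:], xj[split:], y[split:])
        if best is None or err < best[0]:
            best = (err, i, j, coef)

print(f"best lag pair: ({best[1] + 1}, {best[2] + 1}), "
      f"validation MSE = {best[0]:.5f}")
```

A full GMDH network would feed the outputs of the best partial models into further layers and stop when the external criterion stops improving; this sketch shows only the core selection step that distinguishes GMDH from the gradient-trained recurrent models.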

Author Biographies

Yuriy Zaychenko, Educational and Scientific Complex “Institute for Applied System Analysis” of the National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv

Doctor of Technical Sciences, a professor at the Department of Mathematical Methods of System Analysis of the Educational and Scientific Complex “Institute for Applied System Analysis” of the National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine.

Galib Hamidov, “Azershig” company, Baku

Ph.D., the head of the Information Technologies Department of the “Azershig” company, Baku, Azerbaijan.

Aydin Gasanov, National Pedagogical Dragomanov University, Kyiv

Doctor of Technical Sciences, a professor at the Department of Computer Engineering and Educational Measurements of the National Pedagogical Dragomanov University, Kyiv, Ukraine.

References

S. Hochreiter and J. Schmidhuber, “Long Short-Term Memory”, Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.

J. Hochreiter, Untersuchungen zu dynamischen neuronalen Netzen (Diploma thesis in Computer Science). München, Germany: Technische Universität München, 1991, 74 p.

J.J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities”, Proc. of the National Academy of Sciences USA, vol. 79, pp. 2554–2558, 1982.

Y. Cheung, “A new recurrent radial basis function network”, in Proc. of the 9th International Conference on Neural Information Processing (ICONIP’02), vol. 2, pp. 1032–1036, 2002.

V. Baier, “Motion Perception with Recurrent Self-Organizing Maps Based Models”, in Proc. of IJCNN’05, Montreal, Canada, July 31–Aug. 4, 2005, pp. 1182–1186.

Y.-P. Chen and J.-S. Wang, “A Novel Neural Network with Minimal Representation for Dynamic System Identification”, in Proc. of IJCNN’04, Budapest, 2004, pp. 849–854.

Y. Bengio, P. Simard, and P. Frasconi, “Learning Long-Term Dependencies with Gradient Descent is Difficult”, IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 157–166, 1994.

S. Preeti, R. Bala, and R. Singh, “Financial and Non-Stationary Time Series Forecasting using LSTM Recurrent Neural Network for Short and Long Horizon”, in Proc. of the 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), 2019.

T. Fischer and C. Krauss, “Deep Learning with Long Short-Term Memory Networks for Financial Market Predictions”, European Journal of Operational Research, vol. 270, no. 2, pp. 654–669, 2018.

W. Wei and P. Li, “Multi-Channel LSTM with Different Time Scales for Foreign Exchange Rate Prediction”, in Proc. of the International Conference on Advanced Information Science and System, 2019.

Yu.P. Zaychenko, Fundamentals of Intelligent Systems Design, (in Ukrainian). Kyiv: Publ. house “Slovo”, 2004, 352 p.

M. Zgurovsky and Yu. Zaychenko, The Fundamentals of Computational Intelligence: System Approach. Springer, 2016, 275 p.

S. Nikolenko, A. Kadurin, and E. Arhangelskaya, Deep Learning: Immersion into the World of Neural Networks, (in Russian). Saint Petersburg: “Peter”, 2018, 480 p.

S. Haykin, Neural Networks: A Complete Course, (in Russian). Moscow: Publ. House “Williams”, 2006, 1104 p.

S. Osovsky, Neural networks for information processing, (in Russian). Moscow: Finance and Statistics, 2002, 344 p.

Recurrent neural networks, (in Russian). [Online]. Available: https://neerc.ifmo.ru/wiki/index.php?title=Рекуррентные_нейронные_сети

Understanding LSTM Networks. [Online]. Available: https://colah.github.io/posts/2015-08-Understanding-LSTMs/

O.M. Riznyk, “Dynamic recurrent neural networks”, (in Ukrainian), Mathematical Machines and Systems, no. 3, pp. 3–26, 2009.

F.M. Gafarov and A.F. Galimyanov, Artificial Neural Networks and Their Applications, (in Russian). Kazan: Kazan University Press, 2018.

F. Gers and J. Schmidhuber, “Recurrent Nets that Time and Count”, in Proc. of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), 2000.

K. Yao et al., “Depth-Gated Recurrent Neural Networks”, 2015, 5 p.

J. Koutnik, K. Greff, F. Gomez, and J. Schmidhuber, “A Clockwork RNN”, in Proc. of the 31st International Conference on Machine Learning (ICML), 2014, 9 p.

R. Jozefowicz, W. Zaremba, and I. Sutskever, “An Empirical Exploration of Recurrent Network Architectures”, in Proc. of the 32nd International Conference on Machine Learning (ICML), 2015, 9 p.

Published

2021-09-14

Section

Theoretical and applied problems of intellectual systems for decision making support