Generative time series model based on encoder-decoder architecture

Authors

  • Nadezhda Nedashkovskaya, Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv, Ukraine, https://orcid.org/0000-0002-8277-3095
  • Dmytro Androsov, Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv, Ukraine

DOI:

https://doi.org/10.20535/SRIT.2308-8893.2022.1.08

Keywords:

prediction, variational autoencoder, GRU recurrent neural network, neural ordinary differential equation, latent space, nonstationary time series

Abstract

Encoder-decoder neural network models have found widespread use in recent years for solving various machine learning problems. In this paper, we investigate several variants of such models, including sparse, denoising, and variational autoencoders. To predict non-stationary time series, we present and test a generative model based on a variational autoencoder with GRU recurrent networks and elements of neural ordinary differential equations. Based on the constructed model, a system is implemented in Python 3 using the TensorFlow 2 framework and the Keras library. The developed system can be used for modeling continuous time-dependent processes. It minimizes the human factor in time series analysis and provides a high-level, modern interface for the fast and convenient construction and training of deep models.
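The central building block of such a generative model is the variational autoencoder: an encoder (here a GRU network) maps a time-series window to the parameters of a latent Gaussian, a sample is drawn via the reparameterization trick, and the training loss adds a closed-form KL term to the reconstruction error. A framework-agnostic sketch of those two pieces follows; the paper's system itself is built with TensorFlow 2 and Keras, and the function names here are illustrative only:

```python
import math
import random

random.seed(0)

def reparameterize(mu, log_var):
    """Draw z = mu + sigma * eps with eps ~ N(0, 1).

    Sampling this way keeps z differentiable with respect to the encoder
    outputs mu and log_var, which is what allows a VAE to be trained by
    backpropagation."""
    return [m + math.exp(0.5 * lv) * random.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def kl_divergence(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior."""
    return -0.5 * sum(1 + lv - m ** 2 - math.exp(lv)
                      for m, lv in zip(mu, log_var))

# Encoder output for one window, with a 2-dimensional latent space.
mu, log_var = [0.0, 0.0], [0.0, 0.0]
z = reparameterize(mu, log_var)   # here simply a sample from N(0, I)
kl = kl_divergence(mu, log_var)   # 0.0 when the posterior is standard normal
```

In the full model, the decoder (another GRU, or a neural-ODE solver evolving z in continuous time) reconstructs or forecasts the series from z, and the reconstruction term added to this KL penalty would typically be the mean squared error between the input window and its reconstruction.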

Author Biographies

Nadezhda Nedashkovskaya, Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv

Nadezhda I. Nedashkovskaya,

Doctor of Technical Sciences, an associate professor at the Department of Mathematical Methods of Systems Analysis of the Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv, Ukraine.

Dmytro Androsov, Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv

Dmytro V. Androsov,

a graduate student at the Educational and Scientific Complex "Institute for Applied System Analysis" of the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Kyiv, Ukraine.

Published

2022-04-25

Section

Theoretical and applied problems of intellectual systems for decision making support