NEURAL ORDINARY DIFFERENTIAL EQUATIONS FOR TIME SERIES RECONSTRUCTION
DOI: https://doi.org/10.15588/1607-3274-2023-4-7

Keywords: neural ordinary differential equations, deep neural networks, variational autoencoders, recurrent neural networks, long short-term memory networks

Abstract
Context. Neural Ordinary Differential Equations (Neural ODEs) are a family of deep neural networks that leverage numerical integration methods to solve the problem of time series reconstruction from a small number of unevenly distributed samples.
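The continuous-depth idea behind Neural ODEs can be illustrated with a minimal sketch (an assumption for illustration, not the paper's code): a small network parameterizes the derivative dz/dt = f(z, t; θ), and a fixed-step Runge-Kutta solver integrates it between arbitrary, unevenly spaced observation times.

```python
import numpy as np

def f(z, t, W1, b1, W2, b2):
    """Derivative network dz/dt = f(z, t): a tiny two-layer MLP."""
    h = np.tanh(z @ W1 + b1)
    return h @ W2 + b2

def rk4_solve(z0, t_grid, params):
    """Integrate dz/dt = f with the classic fixed-step RK4 scheme over t_grid."""
    zs = [z0]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        h = t1 - t0
        z = zs[-1]
        k1 = f(z, t0, *params)
        k2 = f(z + 0.5 * h * k1, t0 + 0.5 * h, *params)
        k3 = f(z + 0.5 * h * k2, t0 + 0.5 * h, *params)
        k4 = f(z + h * k3, t1, *params)
        zs.append(z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.stack(zs)

rng = np.random.default_rng(0)
d, hdim = 4, 16  # hypothetical latent and hidden sizes
params = (rng.normal(0, 0.1, (d, hdim)), np.zeros(hdim),
          rng.normal(0, 0.1, (hdim, d)), np.zeros(d))
t_grid = np.array([0.0, 0.3, 0.7, 1.5])  # unevenly spaced query times
traj = rk4_solve(rng.normal(size=d), t_grid, params)
print(traj.shape)  # one latent state per requested time
```

Because the solver accepts any time grid, the state can be evaluated exactly at the irregular timestamps of the observations, which is what makes this family attractive for sparse, unevenly sampled series.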
Objective. The goal of this research is the synthesis of a deep neural network able to solve the input signal reconstruction and time series extrapolation tasks.
Method. The proposed method demonstrates the benefits of posing the problem as time series extrapolation rather than forecasting. A model implementing an encoder-decoder architecture with differential equation solving in the latent space is proposed. This approach has been shown to perform well on time series reconstruction when only a small percentage of noisy, unevenly distributed input samples is available. The proposed Latent Ordinary Differential Equations Variational Autoencoder (LODE-VAE) model was benchmarked on synthetic non-stationary data with added white noise, sampled at random intervals.
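The encoder-decoder pipeline described above can be sketched as follows. This is an illustrative NumPy assumption with randomly initialized weights and a simple Euler solver; the paper's actual architecture, dimensions, and solver are not specified here. An RNN-style encoder consumes the irregular observations backwards in time, produces the mean and log-variance of the initial latent state z0, the ODE is solved forward in latent space, and a decoder maps each latent state back to observation space.

```python
import numpy as np

rng = np.random.default_rng(1)
obs_dim, lat_dim, hid_dim = 1, 4, 8  # hypothetical sizes

# Hypothetical parameters (randomly initialized, untrained, for illustration only).
W_in  = rng.normal(0, 0.1, (obs_dim + 1, hid_dim))  # +1 input for the time gap
W_rec = rng.normal(0, 0.1, (hid_dim, hid_dim))
W_mu  = rng.normal(0, 0.1, (hid_dim, lat_dim))
W_lv  = rng.normal(0, 0.1, (hid_dim, lat_dim))
W_f   = rng.normal(0, 0.1, (lat_dim, lat_dim))      # latent dynamics
W_dec = rng.normal(0, 0.1, (lat_dim, obs_dim))      # decoder

def encode(ts, xs):
    """Simplified recurrent encoder run backwards over irregular samples."""
    h = np.zeros(hid_dim)
    prev_t = ts[-1]
    for t, x in zip(ts[::-1], xs[::-1]):
        inp = np.concatenate([x, [prev_t - t]])      # observed value + time gap
        h = np.tanh(inp @ W_in + h @ W_rec)
        prev_t = t
    return h @ W_mu, h @ W_lv                        # mean, log-variance of z0

def decode(z0, query_ts):
    """Euler-integrate the latent dynamics, decode each latent state."""
    z, out, prev_t = z0, [], query_ts[0]
    for t in query_ts:
        z = z + (t - prev_t) * np.tanh(z @ W_f)      # dz/dt = tanh(z W_f)
        out.append(z @ W_dec)
        prev_t = t
    return np.stack(out)

# Sparse, unevenly sampled, noisy observations of a sine wave.
ts = np.sort(rng.uniform(0, 6, 8))
xs = np.sin(ts)[:, None] + rng.normal(0, 0.1, (8, 1))

mu, logvar = encode(ts, xs)
z0 = mu + np.exp(0.5 * logvar) * rng.normal(size=lat_dim)  # reparameterization
recon = decode(z0, np.linspace(0, 6, 50))                  # dense reconstruction
print(recon.shape)
```

The key design point of this family is that the decoder can be queried on an arbitrarily dense time grid, so reconstruction and extrapolation reduce to choosing query times inside or beyond the observed interval.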
Results. The proposed method was implemented as a deep neural network solving the time series extrapolation task.
Conclusions. The conducted experiments confirm that the proposed model solves the given task effectively; it is recommended for real-world problems that require reconstructing the dynamics of non-stationary processes. Prospects for further research include computational optimization of the proposed models, as well as additional experiments with other baselines, e.g., Generative Adversarial Networks or attention networks.
Copyright (c) 2023 D. V. Androsov
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.