ENSEMBLE OF ADAPTIVE PREDICTORS FOR MULTIVARIATE NONSTATIONARY SEQUENCES AND ITS ONLINE LEARNING
DOI: https://doi.org/10.15588/1607-3274-2023-4-9

Keywords: ensemble, metamodels, boosting, bagging, multivariate signals, nonstationarity, forecasting

Abstract
Context. In this research, we explore an ensemble of metamodels that uses multivariate signals to generate forecasts. The ensemble includes traditional forecasting models such as multivariate regression, exponential smoothing, and ARIMAX, as well as nonlinear structures based on artificial neural networks, ranging from simple feedforward networks to deep architectures such as LSTMs and transformers.
Objective. The goal of this research is to develop an effective method for combining the forecasts of multiple models into metamodels, producing a unified forecast that surpasses the accuracy of the individual models. We also aim to investigate the effectiveness of the proposed ensemble on forecasting tasks with nonstationary signals.
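To make the objective concrete, the combination step can be written as a unit-sum weighted sum of the individual forecasts. The formulation below is our own sketch in hypothetical notation (\hat{y}_j(k) for the j-th model's forecast at step k, c_j for its weight), not quoted from the paper body:

\hat{y}(k) = \sum_{j=1}^{m} c_j \hat{y}_j(k), \qquad \sum_{j=1}^{m} c_j = 1,

L(c, \lambda) = \sum_{k} \Bigl( y(k) - \sum_{j=1}^{m} c_j \hat{y}_j(k) \Bigr)^{2} + \lambda \Bigl( \sum_{j=1}^{m} c_j - 1 \Bigr),

where minimising the Lagrangian L over the weights c under the unit-sum constraint yields the combined (metamodel) forecast whose estimation is described in the Method section.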
Method. The proposed ensemble of metamodels estimates the metamodel parameters by the method of Lagrange multipliers: the resulting Kuhn-Tucker system of equations is solved with the least-squares method to obtain unbiased estimates. Additionally, we introduce a recurrent form of the least-squares algorithm for adaptive processing of nonstationary signals.
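As an illustration of this estimation scheme, here is a minimal Python sketch, assuming the standard closed-form solution of the unit-sum constrained least-squares problem and a plain exponentially weighted recursive least-squares update. It is not the authors' implementation, and the renormalisation used to re-impose the unit-sum constraint after each recursive step is our own simplification.

import numpy as np

# Hypothetical sketch (not the paper's code): unit-sum combination weights
# from the Lagrange-multiplier solution of a constrained least-squares
# criterion, plus a simple exponentially weighted RLS update.

def combine_weights(errors: np.ndarray) -> np.ndarray:
    """Estimate combination weights from past forecast errors.

    errors -- array of shape (n_samples, n_models); column j holds the
    residuals of the j-th individual model on a validation window.
    """
    # Empirical second-moment matrix of the individual models' errors.
    V = errors.T @ errors / errors.shape[0]
    ones = np.ones(V.shape[0])
    # Minimising w'Vw subject to 1'w = 1 gives w = V^{-1}1 / (1'V^{-1}1).
    v = np.linalg.solve(V, ones)
    return v / (ones @ v)

def rls_step(w, P, x, y, lam=0.98):
    """One exponentially weighted recursive least-squares step.

    w -- current weight vector, P -- current inverse information matrix,
    x -- vector of individual forecasts at this step, y -- realised value,
    lam -- forgetting factor (< 1) used to track nonstationary signals.
    """
    Px = P @ x
    gain = Px / (lam + x @ Px)
    w = w + gain * (y - w @ x)
    P = (P - np.outer(gain, Px)) / lam
    return w / w.sum(), P  # renormalisation re-imposes the unit-sum constraint

In such a setup, combine_weights would be refreshed on a sliding validation window of forecast errors, while rls_step would update the combination weights online after each new observation.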
Results. The proposed ensemble method is evaluated on a time-series dataset. Metamodels formed by combining various individual models demonstrate improved forecast accuracy compared to the individual models, and the approach proves effective at capturing nonstationary patterns.
Conclusions. The ensemble of metamodels, which uses multivariate signals to generate forecasts, offers a promising approach to achieving better forecasting accuracy. By combining diverse models, the ensemble is robust to nonstationarity and improves the reliability of forecasts.
License
Copyright (c) 2024 Ye. V. Bodyanskiy, Kh. V. Lipianina-Honcharenko, A. O. Sachenko
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Creative Commons Licensing Notifications in the Copyright Notices
The journal allows the authors to hold the copyright without restrictions and to retain publishing rights without restrictions.
The journal allows readers to read, download, copy, distribute, print, search, or link to the full texts of its articles.
The journal allows reuse and remixing of its content in accordance with a Creative Commons CC BY-SA license.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike (CC BY-SA) license that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.