FAST NEURAL NETWORK AND ITS ADAPTIVE LEARNING IN CLASSIFICATION PROBLEMS
DOI: https://doi.org/10.15588/1607-3274-2025-3-5
Keywords: adaptive learning, classification, fast neural network, support vector machine
Abstract
Context. Artificial neural networks are widely used to solve a broad class of information processing tasks, above all pattern recognition under conditions of significant nonlinearity, owing to their universal approximation properties and their ability to learn from training samples. Deep neural networks have become the most widespread; they demonstrate very high recognition quality but require extremely large amounts of training data, which are not always available. Under these conditions, so-called least squares support vector machines (LS-SVM) can be effective: they do not require large training samples, but they can be trained only in batch mode and are quite cumbersome in numerical implementation. Therefore, the problem of training an LS-SVM in sequential mode, under conditions of significant non-stationarity of the data fed online to the neural network for processing, is highly relevant.
Objective. The aim of the work is to introduce an approach to adaptive learning of LS-SVM that makes it possible to abandon the conversion of images into vector signals.
Method. An approach to image recognition using a least squares support vector machine (LS-SVM) is proposed for conditions in which data arrive for processing in a sequential online mode. The advantage of the proposed approach is that it reduces the time needed to solve the image recognition problem and allows the learning process to be implemented on non-stationary training samples. A distinctive feature of the method is its computational simplicity and high speed: the number of neurons in the network does not change over time, i.e., the architecture remains fixed during the tuning process.
Results. The proposed approach to adaptive learning of LS-SVM simplifies the numerical implementation of the neural network and increases the speed of information processing and, above all, of tuning its synaptic weights.
Conclusions. The problem of pattern recognition using the least squares support vector machine (LS-SVM) is considered for conditions in which data arrive for processing in a sequential online mode. The training process is implemented on a sliding window, so the number of neurons in the network does not change over time, i.e., the architecture remains fixed during the tuning process. This approach simplifies the numerical implementation of the system and allows the training process to be carried out on non-stationary training samples. The ability to train on images given not only in vector form but also in matrix form makes it possible to abandon the conversion of images into vector signals.
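The sliding-window training scheme described in the conclusions can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the window size, and the regularization parameter `gamma` are assumptions. The standard LS-SVM dual linear system is simply re-solved on the current window each time a new sample arrives, so the model size stays fixed over time.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM dual linear system:
    #   [ 0    1^T         ] [b]     [0]
    #   [ 1    K + I/gamma ] [alpha] [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    # Decision function: sign(sum_i alpha_i K(x, x_i) + b).
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ alpha + b)

class SlidingWindowLSSVM:
    """Online LS-SVM over the last `window` samples: model size stays fixed."""

    def __init__(self, window=50, gamma=10.0, sigma=1.0):
        self.window, self.gamma, self.sigma = window, gamma, sigma
        self.X, self.y = [], []

    def partial_fit(self, x, y):
        self.X.append(x)
        self.y.append(y)
        if len(self.y) > self.window:  # discard the oldest sample
            self.X.pop(0)
            self.y.pop(0)
        Xw = np.asarray(self.X)
        yw = np.asarray(self.y, dtype=float)
        self.b, self.alpha = lssvm_fit(Xw, yw, self.gamma, self.sigma)

    def predict(self, X_new):
        return lssvm_predict(np.asarray(self.X), self.b, self.alpha,
                             np.asarray(X_new), self.sigma)
```

Because old samples leave the window as new ones arrive, the model tracks non-stationary data while the per-step cost stays bounded by the (fixed) window size.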
License
Copyright (c) 2025 Ye. V. Bodyanskiy, Ye. O. Shafronenko, F. A. Brodetskyi, O. S. Tanianskyi

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Creative Commons Licensing Notifications in the Copyright Notices
The journal allows the authors to hold the copyright without restrictions and to retain publishing rights without restrictions.
The journal allows readers to read, download, copy, distribute, print, search, or link to the full texts of its articles.
The journal allows reuse and remixing of its content, in accordance with the Creative Commons license CC BY-SA.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License CC BY-SA that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.