FAST NEURAL NETWORK AND ITS ADAPTIVE LEARNING IN CLASSIFICATION PROBLEMS

Authors

  • Ye. V. Bodyanskiy, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine
  • Ye. O. Shafronenko, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine
  • F. A. Brodetskyi, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine
  • O. S. Tanianskyi, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

DOI:

https://doi.org/10.15588/1607-3274-2025-3-5

Keywords:

Adaptive learning, classification, fast neural network, support vector machine

Abstract

Context. To solve a wide class of information-processing tasks and, above all, pattern recognition under conditions of significant nonlinearity, artificial neural networks have become widely used thanks to their universal approximating properties and their ability to learn from training samples. Deep neural networks have become the most widespread; they indeed demonstrate very high recognition quality but require extremely large amounts of training data, which are not always available. Under these conditions, the so-called least squares support vector machines (LS-SVM) can be effective: they do not require large training samples, but they can be trained only in batch mode and are quite cumbersome in numerical implementation. Therefore, the problem of training an LS-SVM in sequential mode, under conditions of significant non-stationarity of data fed online to the neural network for processing, is quite relevant.
Objective. The aim of the work is to introduce an approach to adaptive learning of the LS-SVM that allows us to abandon the conversion of images into vector signals.
Method. An approach to image recognition using a least squares support vector machine (LS-SVM) is proposed for conditions in which data arrive for processing in a sequential online mode. The advantage of the proposed approach is that it reduces the time needed to solve the image recognition problem and allows the learning process to be implemented on non-stationary training samples. A distinguishing feature of the proposed method is its computational simplicity and high speed, since the number of neurons in the network does not change over time, i.e., the architecture remains fixed during the tuning process.
Results. The proposed approach to adaptive learning of the LS-SVM simplifies the numerical implementation of the neural network and increases the speed of information processing and, above all, of the tuning of its synaptic weights.
Conclusions. The problem of pattern recognition using the least squares support vector machine (LS-SVM) is considered under conditions when data arrive for processing in a sequential online mode. The training process is implemented on a sliding window, so the number of neurons in the network does not change over time, i.e., the architecture remains fixed during the tuning process. This approach simplifies the numerical implementation of the system and allows the training process to be carried out on non-stationary training samples. The possibility of training on images given not only in vector form but also in matrix form allows us to abandon the conversion of images into vector signals.
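To make the sliding-window scheme concrete, below is a minimal sketch, not the authors' implementation: a binary LS-SVM classifier with a Gaussian kernel whose training set is restricted to the last W samples, so the number of kernel neurons (and thus the architecture) stays fixed as new data arrive. The window size, the regularization parameter gamma, and the kernel width sigma are illustrative assumptions, and for simplicity the LS-SVM linear system is re-solved at each step rather than updated recursively.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM system of linear equations
    #   [ 0   1^T          ] [b]     [0]
    #   [ 1   K + I/gamma  ] [alpha] [y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]  # bias b, dual weights alpha

class SlidingLSSVM:
    """LS-SVM trained on a sliding window of the last `window` samples,
    so the set of kernel neurons stays at a fixed size."""

    def __init__(self, window=50, gamma=10.0, sigma=1.0):
        self.window, self.gamma, self.sigma = window, gamma, sigma
        self.X, self.y = None, None

    def update(self, x_new, y_new):
        # Append the new sample, drop the oldest one beyond the window,
        # and refit on the current window.
        x_new = np.atleast_2d(np.asarray(x_new, float))
        if self.X is None:
            self.X, self.y = x_new, np.array([y_new], float)
        else:
            self.X = np.vstack([self.X, x_new])[-self.window:]
            self.y = np.append(self.y, y_new)[-self.window:]
        self.b, self.alpha = lssvm_fit(self.X, self.y, self.gamma, self.sigma)

    def predict(self, Xq):
        # Class label is the sign of the kernel expansion over the window
        s = rbf_kernel(np.atleast_2d(np.asarray(Xq, float)),
                       self.X, self.sigma) @ self.alpha + self.b
        return np.sign(s)
```

Re-solving the system costs O(W^3) per step, but with a fixed, modest window this remains cheap; a recursive update of the inverse, as in adaptive LS-SVM schemes, would reduce this further while keeping the same fixed architecture.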

Author Biographies

Ye. V. Bodyanskiy, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

Dr. Sc., Professor at the Department of Artificial Intelligence

Ye. O Shafronenko, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

Assistant at the Department of Media Engineering and Information Radio Electronic Systems

F. A. Brodetskyi, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

PhD Student, Assistant at the Department of Informatics

O. S. Tanianskyi, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

Postgraduate student at the Department of Informatics

References

Goodfellow I., Bengio Y. and Courville A. Deep learning. The MIT Press, 2016.

Graupe D. Deep learning neural networks: design and case studies. World Scientific Publishing Company, 2016. https://doi.org/10.1142/10190

Aggarwal C. C. Neural networks and deep learning. Cham, Springer, 2018. https://doi.org/10.1007/978-3-319-94463-0

Poggio T. and Girosi F. Networks for approximation and learning, Proceedings of the IEEE, 1990, Vol. 78, No. 9, pp. 1481–1497.

Haykin S. Neural networks: a comprehensive foundation, 2nd ed. Prentice Hall PTR, 2004.

Vapnik V. N. The Nature of Statistical Learning Theory. New York, Springer, 1995.

Cortes C. and Vapnik V. Support-vector networks, Machine Learning, Sep. 1995, Vol. 20, No. 3, pp. 273–297, https://doi.org/10.1007/bf00994018.

Steinwart I. and Christmann A. Support Vector Machines. New York, Springer, 2008.

Zahirniak D. R., Chapman R., Rogers S. K., Suter B. W., Kabriski M., and Pyatti V. Pattern recognition using radial basis function network, Aerospace Application of Artificial Intelligence, 1990, pp. 249–260.

Vandewalle J., Moor B. D., Gestel T. V., Brabanter J. D., and Suykens J. A. K. Least Squares Support Vector Machines. World Scientific Publishing Company, 2003.

Bodyanskiy Y., Deineko A., Brodetskyi F., and Kosmin D. Adaptive least-squares support vector machine and its online learning, CEUR Workshop Proceedings, Nov. 2020, Vol. 2762, Art. no. 3. http://ceur-ws.org/Vol-2762/paper3.pdf

Xiao H., Rasul K., and Vollgraf R. Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. Available: https://arxiv.org/abs/1708.07747

Published

2025-09-22

How to Cite

Bodyanskiy, Y. V., Shafronenko, Y. O., Brodetskyi, F. A., & Tanianskyi, O. S. (2025). FAST NEURAL NETWORK AND ITS ADAPTIVE LEARNING IN CLASSIFICATION PROBLEMS. Radio Electronics, Computer Science, Control, (3), 45–51. https://doi.org/10.15588/1607-3274-2025-3-5

Issue

Section

Neuroinformatics and intelligent systems