HYBRID MACHINE LEARNING TECHNOLOGIES FOR PREDICTING COMPREHENSIVE ACTIVITIES OF INDUSTRIAL PERSONNEL USING SMARTWATCH DATA

Authors

  • O. M. Pavliuk, Silesian University of Technology, Gliwice, Poland; Lviv Polytechnic National University, Lviv, Ukraine
  • M. O. Medykovskyy, Lviv Polytechnic National University, Lviv, Ukraine
  • M. V. Mishchuk, Lviv Polytechnic National University, Lviv, Ukraine
  • A. O. Zabolotna, Lviv Polytechnic National University, Lviv, Ukraine
  • O. V. Litovska, Lviv Polytechnic National University, Lviv, Ukraine

DOI:

https://doi.org/10.15588/1607-3274-2025-3-10

Keywords:

distributed system, smart watch, industrial personnel, basic classifier, complex activity, classification, prediction

Abstract

Context. In today’s industrial development, significant attention is paid to systems that recognize and predict human activity in real time. Such technologies are key to the transition from Industry 4.0 to Industry 5.0, as they improve interaction between humans and machines and ensure a higher level of safety, adaptability, and efficiency of production processes. These approaches are particularly relevant in internal logistics, where cooperation with autonomous vehicles requires a high level of coordination and adaptability.
Objective. To create a technological solution for the prompt detection and prediction of complex human activity in an internal logistics environment using sensor data from smartwatches. The main goal is to improve cooperation between employees and automated systems and to increase occupational safety and the efficiency of logistics processes.
Method. A decentralized data collection system based on smartwatches was developed. A mobile application written in Kotlin captured sensor readings while five workers performed a series of logistics actions. To process incomplete or distorted data, anomaly detection algorithms were applied, including STD, logarithmic transformation of STD, DBSCAN, and IQR, together with smoothing methods such as the moving average, weighted moving average, exponential smoothing, local regression, and the Savitzky-Golay filter. The processed data were used to train models employing advanced techniques such as transfer learning, the continuous wavelet transform, and classifier stacking.
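As an illustration of the preprocessing stage described above, the sketch below (not taken from the paper; the threshold, window size, and sample values are assumptions) combines IQR-based anomaly filtering with moving-average smoothing on a toy accelerometer series:

```python
# Illustrative sketch, not the authors' code: IQR-based anomaly
# filtering followed by simple moving-average smoothing, two of the
# preprocessing techniques named in the Method section.
from statistics import quantiles

def iqr_filter(values, k=1.5):
    """Mark points outside [Q1 - k*IQR, Q3 + k*IQR] as anomalous (None)."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v if lo <= v <= hi else None for v in values]

def moving_average(values, window=3):
    """Smooth a series with a trailing window, skipping anomalous points."""
    out = []
    for i in range(len(values)):
        kept = [v for v in values[max(0, i - window + 1):i + 1] if v is not None]
        out.append(sum(kept) / len(kept) if kept else None)
    return out

accel_z = [9.8, 9.9, 10.0, 55.0, 9.7, 9.8, 9.6]  # 55.0 mimics a sensor glitch
cleaned = iqr_filter(accel_z)                    # glitch replaced with None
smoothed = moving_average(cleaned)               # gap filled from neighbours
```

In a real pipeline the filtered value would typically be interpolated rather than dropped; here the trailing window simply averages over the remaining neighbours.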
Results. A pre-trained deep model with the DenseNet121 architecture was chosen as the base classifier; it achieved an F1-score of 91.01% in recognizing simple actions. Five neural network architectures (single- and multi-layer) with two data distribution strategies were tested for the analysis of complex activity. The highest accuracy, an F1-score of 87.44%, was achieved by the convolutional neural network with the joint data distribution approach.
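The continuous wavelet transform mentioned in the Method section is what turns a one-dimensional sensor window into an image-like scalogram that a pretrained CNN such as DenseNet121 can consume. The following is a simplified, real-valued Morlet sketch of that step (the scales, window length, and wavelet parameters are illustrative assumptions, and the CNN itself is omitted):

```python
# Simplified real-valued continuous wavelet transform: correlates the
# signal with a scaled Morlet wavelet (real part only) at several scales,
# producing a scale-by-time "scalogram" matrix.
import math

def morlet(t, scale, w0=6.0):
    """Real part of a Morlet wavelet at offset t for the given scale."""
    x = t / scale
    return math.cos(w0 * x) * math.exp(-0.5 * x * x) / math.sqrt(scale)

def cwt_scalogram(signal, scales, support=3.0):
    """One row per scale, one column per sample; values are |correlation|."""
    n = len(signal)
    rows = []
    for s in scales:
        half = int(support * s)  # truncate the wavelet beyond +-support*scale
        row = []
        for i in range(n):
            acc = 0.0
            for k in range(-half, half + 1):
                if 0 <= i + k < n:
                    acc += signal[i + k] * morlet(k, s)
            row.append(abs(acc))
        rows.append(row)
    return rows

# A toy 64-sample accelerometer window with a single dominant frequency.
signal = [math.sin(2 * math.pi * 0.2 * i) for i in range(64)]
scalogram = cwt_scalogram(signal, scales=[1, 2, 4, 8])
```

A production pipeline would use a complex-valued wavelet (e.g. via an FFT-based CWT) and resize the scalogram to the CNN's expected input shape; this sketch only shows the scale-time structure of the representation.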
Conclusions. The results of the study indicate that the proposed technology can be applied to real-time recognition of complex human activities in internal logistics systems based on data from smartwatch sensors, which will improve human-machine interaction and increase the efficiency of industrial logistics processes.
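The two-level (stacking) scheme underlying the abstract's hybrid approach can be sketched as follows: per-step simple-action probabilities from the base classifier are concatenated into one feature vector, and a second-level model maps that vector to a complex-activity label. The nearest-centroid meta-model, the class names, and all numbers below are invented for illustration only:

```python
# Hypothetical stacking illustration: base-classifier outputs over a
# window of time steps become the input features of a meta-classifier.
def stack_features(step_probs):
    """Flatten a sequence of per-step class-probability vectors."""
    return [p for step in step_probs for p in step]

def nearest_centroid(feature, centroids):
    """Return the label whose centroid is closest in squared distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(feature, centroids[label]))

# Base-classifier outputs over 3 steps; classes = (walk, lift, carry).
window = [[0.8, 0.1, 0.1], [0.2, 0.7, 0.1], [0.1, 0.2, 0.7]]
features = stack_features(window)  # 9-dimensional stacked vector
centroids = {
    "pick-and-place": [0.7, 0.2, 0.1, 0.2, 0.7, 0.1, 0.1, 0.2, 0.7],
    "transport":      [0.8, 0.1, 0.1, 0.8, 0.1, 0.1, 0.8, 0.1, 0.1],
}
label = nearest_centroid(features, centroids)
```

The paper's actual second level is a neural network trained on such stacked representations; the nearest-centroid rule stands in here only to keep the sketch self-contained.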

Author Biographies

O. M. Pavliuk, Silesian University of Technology, Gliwice, Poland; Lviv Polytechnic National University, Lviv, Ukraine

PhD, Researcher, Department of Distributed Systems and Informatic Devices; Doctoral Student, Department of Automated Control Systems

M. O. Medykovskyy, Lviv Polytechnic National University, Lviv, Ukraine

Dr. Sc., Professor, Department of Automated Control Systems

M. V. Mishchuk, Lviv Polytechnic National University, Lviv, Ukraine

Assistant, Department of Automated Control Systems

A. O. Zabolotna, Lviv Polytechnic National University, Lviv, Ukraine

Student, Department of Automated Control Systems

O. V. Litovska, Lviv Polytechnic National University, Lviv, Ukraine

Student, Department of Automated Control Systems



Published

2025-09-22

How to Cite

Pavliuk, O. M., Medykovskyy, M. O., Mishchuk, M. V., Zabolotna, A. O., & Litovska, O. V. (2025). HYBRID MACHINE LEARNING TECHNOLOGIES FOR PREDICTING COMPREHENSIVE ACTIVITIES OF INDUSTRIAL PERSONNEL USING SMARTWATCH DATA. Radio Electronics, Computer Science, Control, (3), 96–111. https://doi.org/10.15588/1607-3274-2025-3-10

Issue

Section

Neuroinformatics and intelligent systems