METHOD FOR ANALYZING INPUT DATA FROM GEAR VIBRATIONS
DOI: https://doi.org/10.15588/1607-3274-2025-2-13

Keywords: 1D convolutional neural networks, multilayer perceptron, stochastic gradient descent, loss function, engine vibration analysis, helicopter performance, signal classification, neural network, machine learning

Abstract
Context. The paper considers the problem of analyzing large data vectors to assess helicopter engine performance. This issue is crucial for improving the reliability and efficiency of modern aviation technologies.
Objective. To create a method for analyzing engine vibration data to achieve accurate classification of engine states based on vibration signals.
Method. The input data are analyzed, and a neural network is trained to recognize the class of an input vector. The trained network can be used immediately and can also be configured for further training on similar data. The program was implemented using a classical neural network approach: the optimal weights and biases are found by minimizing the loss function with its derivatives. The stochastic gradient descent (SGD) algorithm was used for optimization, and different activation functions were tested to find the best configuration. Choosing the right activation functions ensured maximum performance.
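The abstract does not specify the network architecture, data dimensions, or training schedule, so the following is only a hedged sketch of the described approach: a one-hidden-layer perceptron trained with plain per-sample SGD on synthetic vibration-like vectors. The data generator, layer sizes, learning rate, and activation choices are all illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_signal(label, n=64):
    # Synthetic stand-in for a vibration vector (assumption, not the paper's
    # data): class-1 signals carry an extra high-frequency component, echoing
    # the observation that first-class vectors had more peaks.
    t = np.linspace(0.0, 1.0, n)
    x = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal(n)
    if label == 1:
        x += 0.8 * np.sin(2 * np.pi * 17 * t)
    return x

X = np.array([make_signal(i % 2) for i in range(200)])
y = np.array([i % 2 for i in range(200)], dtype=float)

# One hidden layer: input -> ReLU hidden -> sigmoid output (illustrative sizes).
H = 16
W1 = rng.standard_normal((X.shape[1], H)) * 0.1
b1 = np.zeros(H)
W2 = rng.standard_normal(H) * 0.1
b2 = 0.0
lr = 0.05

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(30):
    for i in rng.permutation(len(X)):       # stochastic: one sample per update
        x, t_ = X[i], y[i]
        h_pre = x @ W1 + b1
        h = np.maximum(h_pre, 0.0)          # ReLU activation
        p = sigmoid(h @ W2 + b2)            # predicted class-1 probability
        g_out = p - t_                      # d(log loss)/d(pre-sigmoid output)
        g_h = g_out * W2 * (h_pre > 0)      # backprop through ReLU
        W2 -= lr * g_out * h                # SGD weight/bias updates
        b2 -= lr * g_out
        W1 -= lr * np.outer(x, g_h)
        b1 -= lr * g_h

pred = (sigmoid(np.maximum(X @ W1 + b1, 0.0) @ W2 + b2) > 0.5).astype(float)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because the two synthetic classes are cleanly separable, even this small perceptron reaches high accuracy within a few epochs; the same loop structure accommodates trying other activation functions, as the method describes.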
Results. The graphs of the input vectors show that vectors from the first class had more peaks, which facilitated classification. After applying this method, the accuracy was about 70–75%, which was insufficient for the task. To improve this, we enhanced the model structure and reconfigured the activation functions. With the new method, the neural network classifies the input vector with 100% accuracy.
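The observation that first-class vectors show more peaks can be made concrete with a simple local-maximum count. This is a hedged illustration on synthetic deterministic signals; the paper's actual signals and its peak criterion are not specified in the abstract.

```python
import numpy as np

def count_peaks(x):
    # Strict local maxima: samples greater than both immediate neighbours.
    return int(np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])))

t = np.linspace(0.0, 1.0, 200, endpoint=False)
smooth = np.sin(2 * np.pi * 3 * t)                 # class-0-like: few peaks
spiky = smooth + 0.8 * np.sin(2 * np.pi * 17 * t)  # class-1-like: many peaks

print(count_peaks(smooth), count_peaks(spiky))
```

A feature this crude already separates the two synthetic classes, which suggests why a shallow perceptron, given well-chosen activations, can classify such vectors without a deep architecture.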
Conclusions. This study presents an approach to analyzing engine vibration data for assessing performance. The scientific novelty lies in adapting a multilayer perceptron (MLP) for classifying vibration signals. The research shows that high accuracy can be achieved without deep architectures by optimizing the MLP. The method is universally applicable, eliminating additional model adaptation costs, which is crucial for industrial use. The practical significance is demonstrated through software and experiments, proving the effectiveness of the MLP for performance monitoring when model parameters and activation functions are properly adjusted.
License
Copyright (c) 2025 O. Y. Shalimov, O. O. Moskalchuk, O. M. Yevseienko

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.