METHODS AND ALGORITHMS OF BUILDING A 3D MATHEMATICAL MODEL OF THE SURROUNDING SPACE FOR AUTOMATIC LOCALIZATION OF A MOBILE OBJECT
DOI: https://doi.org/10.15588/1607-3274-2025-3-3
Keywords: mathematical 3D model, localization method, SLAM method algorithms, position determination, mobile object
Abstract
Context. The task of automating the positioning of a mobile object in a closed space under the condition of its partial or complete autonomy is considered. The object of study is the process of automatic construction of a 3D model of the surrounding space.
Objective. The goal of the work is to develop an algorithm for creating a 3D model of the surrounding space for further localization of a mobile object under conditions of its partial or complete autonomy.
Method. The results of studying the problem of localizing a mobile object in space in real time are presented, together with an analysis of existing methods and algorithms for creating mathematical models of the surrounding space. The algorithms that are widely used to solve the localization problem are described. A wide range of methods for constructing a mathematical model of the surrounding space has been examined: from methods that align successive point clouds of the surrounding scene to methods that capture a series of snapshots of characteristic points and match those points across snapshots by selecting, for each point, the counterpart whose parameter vector is most similar.
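To make the first family of methods concrete (aligning successive point clouds of the scene), the following minimal sketch implements a basic ICP (iterative closest point) loop in Python with NumPy and SciPy. It is an illustration only: the synthetic scans, function names, and convergence parameters are assumptions chosen for demonstration and are not taken from the paper.

```python
# Minimal, illustrative ICP alignment of two successive point clouds.
# All data and parameters below are assumptions for demonstration.
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp(source, target, iterations=30, tol=1e-6):
    """Iteratively match nearest neighbours and re-estimate the rigid transform."""
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    err = prev_err
    for _ in range(iterations):
        dist, idx = tree.query(src)     # closest target point for every source point
        err = dist.mean()               # mean nearest-neighbour distance
        if abs(prev_err - err) < tol:
            break
        prev_err = err
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, err


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.uniform(-1, 1, size=(500, 3))            # synthetic "previous" scan
    angle = np.deg2rad(10)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
    source = target @ R_true.T + np.array([0.05, -0.02, 0.01])  # displaced "current" scan
    R_est, t_est, err = icp(source, target)
    print("final mean nearest-neighbour distance:", err)
```

In scan-matching pipelines of this kind, each successful alignment yields the rigid transform (rotation and translation) between consecutive scans, which describes the displacement of the mobile object between measurements.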
Results. A three-stage method for constructing a 3D model of the surrounding space is proposed for solving the problem of localizing a mobile object in a closed space.
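The abstract does not detail the three stages themselves, so they are not reproduced here. As an illustration of the second family of methods mentioned above (matching characteristic points across snapshots by the similarity of their parameter vectors), the sketch below shows nearest-neighbour descriptor matching with a ratio test in plain NumPy; the random descriptors and the 0.75 threshold are assumptions chosen for illustration, not values from the paper.

```python
# Illustrative matching of characteristic points between two snapshots by
# comparing their descriptor ("parameter") vectors. Assumed, synthetic data.
import numpy as np


def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Nearest-neighbour matching with a ratio test.

    desc_a, desc_b : (N, D) arrays of per-keypoint parameter vectors.
    Returns (index_in_a, index_in_b) pairs whose best match is clearly
    better than the second-best candidate.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)   # distance to every candidate
        j1, j2 = np.argsort(dist)[:2]               # best and second-best candidates
        if dist[j1] < ratio * dist[j2]:             # keep only unambiguous matches
            matches.append((i, j1))
    return matches


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    snapshot_1 = rng.normal(size=(200, 64))          # descriptors from the first snapshot
    noise = 0.05 * rng.normal(size=(200, 64))
    snapshot_2 = np.vstack([snapshot_1 + noise,      # the same points, slightly perturbed
                            rng.normal(size=(50, 64))])  # plus unrelated points
    pairs = match_descriptors(snapshot_1, snapshot_2)
    correct = sum(i == j for i, j in pairs)
    print(f"{len(pairs)} matches, {correct} of them correct")
```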
Conclusions. The conducted experiments have confirmed that the proposed algorithm for three-stage construction of a mathematical model of the environment can be used to determine the position of a mobile object in space. The methods used in the algorithm provide information about the surrounding space that is sufficient to localize a mobile object in a closed space. Prospects for further research may lie in integrating the information flows about the object's position coming from different devices, depending on the type of data acquisition, into a centralized information base for solving a wide range of tasks performed by automatic mobile objects (robots).
License
Copyright (c) 2025 Ya. W. Korpan, O. V. Nechyporenko, E. E. Fedorov, T. Yu. Utkina

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Creative Commons Licensing Notifications in the Copyright Notices
The journal allows the authors to hold the copyright without restrictions and to retain publishing rights without restrictions.
The journal allows readers to read, download, copy, distribute, print, search, or link to the full texts of its articles.
The journal allows reuse and remixing of its content in accordance with a Creative Commons license CC BY-SA.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License CC BY-SA that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.