Design and Preliminary Verification of an Intelligent Factory Unmanned Driving Auxiliary Decision-Making System Based on the Lightweight AI Model YOLOv5n
DOI: https://doi.org/10.54097/10snfk47
Keywords: YOLOv5n, intelligent factory, autonomous driving, auxiliary decision-making, multi-sensor fusion.
Abstract
This paper presents the design and preliminary validation of an intelligent factory unmanned driving auxiliary decision-making system built on the lightweight YOLOv5n model. The study aims to overcome challenges in real-time environmental perception and decision-making within computationally constrained edge environments. The proposed system integrates an optimized YOLOv5n architecture for high-speed object detection, a rule-based decision module supporting dynamic obstacle avoidance, and a multi-sensor fusion framework incorporating LiDAR, millimeter-wave radar, and camera inputs. Communication coordination via V2X technology further enhances situational awareness and inter-device collaboration. Experimental results demonstrate that the system achieves high detection accuracy and maintains a stable, high frame rate in typical factory scenarios, effectively meeting real-time operational demands. Nevertheless, limitations remain in model robustness under extreme conditions, edge hardware computational bottlenecks, and relatively simplistic decision logic. Future work will focus on adaptive learning mechanisms, deeper sensor fusion, and edge-specific optimizations to strengthen generalization, deployment flexibility, and the robustness and reliability needed for broad industrial adoption.
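The rule-based obstacle-avoidance module described in the abstract can be pictured with a minimal sketch. All names, thresholds, and the `Detection` structure below are illustrative assumptions for exposition, not the paper's implementation; a deployed system would calibrate such thresholds per vehicle and fuse ranges from LiDAR, radar, and camera before this step.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "worker", "forklift", "pallet" (hypothetical classes)
    distance_m: float  # fused range estimate from the sensor-fusion stage
    confidence: float  # detector confidence in [0, 1]

# Illustrative thresholds, not values from the paper.
STOP_DISTANCE_M = 2.0
SLOW_DISTANCE_M = 5.0
MIN_CONFIDENCE = 0.5

def decide(detections: list[Detection]) -> str:
    """Map fused detections to a driving action: 'stop', 'slow', or 'proceed'."""
    # Discard low-confidence detections before applying distance rules.
    relevant = [d for d in detections if d.confidence >= MIN_CONFIDENCE]
    if any(d.distance_m <= STOP_DISTANCE_M for d in relevant):
        return "stop"
    if any(d.distance_m <= SLOW_DISTANCE_M for d in relevant):
        return "slow"
    return "proceed"
```

The sketch keeps the decision logic deliberately simple, mirroring the "relatively simplistic decision logic" the abstract itself identifies as a limitation; adaptive or learned policies are left to the future work it outlines.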
Copyright (c) 2025 Highlights in Science, Engineering and Technology

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.