Administrated by China Association for Science and Technology
Sponsored by China Society of Automotive Engineers
Published by AUTO FAN Magazine Co. Ltd.

Automotive Engineering ›› 2025, Vol. 47 ›› Issue (9): 1712-1720. doi: 10.19562/j.chinasae.qcgc.2025.09.007


LiDAR-Visual Fusion SLAM System for High-Dynamic Environment

Yunshui Zhou1, Chengyu Gao1, Shengjie Huang1, Runbang Zhang1, Xin Chen2, Yougang Bian1, Hongmao Qin1

  1. College of Mechanical and Vehicle Engineering, State Key Laboratory of Advanced Design and Manufacturing Technology for Vehicle, Hunan University, Changsha 410082
    2. Beijing Automotive Technology Center, Beijing 100021
  • Received: 2024-12-25 Revised: 2025-04-16 Online: 2025-09-25 Published: 2025-09-19
  • Contact: Hongmao Qin E-mail: qinhongmao@vip.sina.com

Abstract:

Accurate localization and mapping are critical for autonomous driving systems. However, single-sensor simultaneous localization and mapping (SLAM) systems often struggle to operate reliably across different environments, particularly in highly dynamic scenes where moving obstacles can degrade accuracy or even cause system failure. This paper therefore proposes a LiDAR-visual fusion SLAM framework for high-precision mapping and localization in dynamic environments. First, an odometry method that fuses sparse LiDAR point clouds with dense image data is designed, leveraging the high-precision ranging capability of LiDAR and the rich information provided by images to improve odometry accuracy. To cope with highly dynamic scenes, a real-time image semantic segmentation network, BiSeNetV2, is combined with motion feature detection based on inter-frame and multi-frame sequences to identify, efficiently and accurately, the dynamic points among the 3D feature points obtained from the LiDAR-visual fusion; these points are then removed from the map to mitigate the influence of dynamic obstacles. Experiments on the nuScenes autonomous driving dataset show that the proposed system significantly improves the accuracy and robustness of localization and mapping in dynamic environments.
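To illustrate the dynamic-point removal idea described in the abstract, the following is a minimal Python sketch (not the authors' implementation): fused 3D feature points are projected into the current image, checked against a per-pixel semantic mask of movable classes such as one produced by a segmentation network like BiSeNetV2, and additionally tested for residual motion across frames after ego-motion compensation; only points flagged by both cues are excluded from mapping. All function names, class IDs, and thresholds below are illustrative assumptions, not the paper's API.

# Minimal sketch of semantic + multi-frame dynamic-point filtering (assumed names).
import numpy as np

MOVABLE_CLASS_IDS = {11, 12, 13}  # e.g. car, pedestrian, cyclist (assumed label IDs)

def project_to_image(points_cam, K):
    """Pinhole projection of Nx3 camera-frame points with 3x3 intrinsics K."""
    uvw = (K @ points_cam.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]
    return uv, points_cam[:, 2]

def filter_dynamic_points(points_cam, prev_points_cam, semantic_mask, K,
                          motion_threshold=0.2):
    """Return a boolean mask of points kept as static for mapping.

    points_cam       : Nx3 fused 3D feature points in the current camera frame
    prev_points_cam  : Nx3 same points, ego-motion-compensated into the current frame
                       from the previous observation
    semantic_mask    : HxW per-pixel class-ID map from the segmentation network
    motion_threshold : residual displacement (m) above which a point counts as moving
    """
    uv, depth = project_to_image(points_cam, K)
    h, w = semantic_mask.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)

    # Semantic prior: points projecting onto movable-object pixels are candidates.
    on_movable = np.isin(semantic_mask[v, u], list(MOVABLE_CLASS_IDS))

    # Multi-frame geometric check: after ego-motion compensation, a static point
    # should remain (almost) in place between frames.
    residual = np.linalg.norm(points_cam - prev_points_cam, axis=1)
    moving = residual > motion_threshold

    # Drop a point only if both cues flag it as dynamic (and it lies in front of the camera).
    dynamic = on_movable & moving & (depth > 0)
    return ~dynamic

Requiring both the semantic cue and the geometric cue is a common design choice in this kind of pipeline: it avoids discarding parked vehicles (movable class but static) while still rejecting points on objects that are actually moving.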

Key words: SLAM, image, LiDAR, multi-sensor fusion, high-dynamic environment