Administered by China Association for Science and Technology
Sponsored by China Society of Automotive Engineers
Published by AUTO FAN Magazine Co. Ltd.

Automotive Engineering ›› 2025, Vol. 47 ›› Issue (3): 440-448. doi: 10.19562/j.chinasae.qcgc.2025.03.006


A Method for Intelligent Driving Simulation Scenes Generation Based on Fusion of Virtual and Real Perception Data

Linguo Chai1, Xiangyan Liu1, Wei Shangguan1, Yu Du2, Xiaohui Ba1, Baigen Cai1

  1. School of Automation and Intelligence, Beijing Jiaotong University, Beijing 100044
  2. China Intelligent and Connected Vehicles (Beijing) Research Institute Co., Ltd., Beijing 100176
  • Received: 2024-02-26 Revised: 2024-04-15 Online: 2025-03-25 Published: 2025-03-21
  • Contact: Wei Shangguan E-mail: wshg@bjtu.edu.cn

Abstract:

To achieve customizable design and high-fidelity generation of perception data for intelligent driving simulation tests, this paper establishes an intelligent driving test scenario simulation architecture that fuses virtual and real perception data. By fusing the perception data of simulated traffic participants with real environment scene data, perception simulation data can be generated continuously with dangerous test scenarios as the target. On this basis, the RANSAC method is used to extract obstacle positions from the real point cloud and to determine, at each moment, the operating-space constraints on simulated traffic participants within the real environment scene. Then, to realize the interaction between the behaviors and positions of the ego vehicle and the other traffic participants in the test scenario, the ego vehicle and traffic participants are modeled in the simulation software and their behaviors designed based on the real ego-vehicle sensor parameters and motion trajectories, so that continuous perception data of the simulated traffic participants can be output. Finally, a mask replacement method and a ray replacement strategy are used to fuse the virtual and real image and point cloud data, respectively, yielding virtual-real fused perception data of dangerous driving test scenes in different real environment scenarios. The simulation data are tested and verified. The results show that most scenarios in the real-road collected data set can support simulation data injection, and the injected simulated traffic participant behaviors match the test scene requirements with high authenticity. At the perception level, the injected simulated traffic participants and the real traffic participants reach a similarity of 86.5% in object detection confidence. The proposed method can controllably inject simulated traffic participants that meet test requirements into real environment scene data, and quickly and synchronously obtain highly realistic virtual-real fused image and point cloud data.
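
The abstract states that RANSAC is used to extract obstacle positions from the real point cloud and derive operating-space constraints for the injected simulated traffic participants. The sketch below is not the authors' implementation; it is a minimal illustration of that step, assuming the real LiDAR frame is available as a PCD file and using Open3D's RANSAC plane segmentation plus DBSCAN clustering to separate ground from obstacle regions.

```python
# Minimal sketch (not the paper's code): RANSAC ground extraction and obstacle
# clustering from a real LiDAR frame, assuming the frame is stored as a PCD file.
import numpy as np
import open3d as o3d

def extract_obstacle_regions(pcd_path, dist_thresh=0.2, cluster_eps=0.8, min_points=20):
    pcd = o3d.io.read_point_cloud(pcd_path)  # hypothetical input file

    # RANSAC plane fit: the dominant plane is taken as the drivable ground.
    plane_model, ground_idx = pcd.segment_plane(
        distance_threshold=dist_thresh, ransac_n=3, num_iterations=1000)

    # Points off the ground plane are treated as potential obstacles.
    obstacles = pcd.select_by_index(ground_idx, invert=True)

    # DBSCAN clustering groups obstacle points into individual objects.
    labels = np.array(obstacles.cluster_dbscan(eps=cluster_eps, min_points=min_points))

    # Axis-aligned bounds of each cluster approximate the occupied space, i.e. the
    # regions that constrain where simulated traffic participants may be placed.
    boxes = []
    pts = np.asarray(obstacles.points)
    for lbl in range(labels.max() + 1):
        cluster = pts[labels == lbl]
        boxes.append((cluster.min(axis=0), cluster.max(axis=0)))
    return plane_model, boxes
```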
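
For the image branch, the abstract describes a mask replacement method that composites rendered simulated traffic participants into the real camera frame. The snippet below is only a schematic of that idea, under the assumption that the simulator already provides an RGB render and a binary foreground mask aligned with the real frame; it is not the authors' fusion pipeline.

```python
# Schematic mask-replacement composite (assumed inputs, not the paper's pipeline):
# real_img - real camera frame, HxWx3 uint8
# sim_img  - simulator render of the injected traffic participant, HxWx3 uint8
# sim_mask - binary mask of the rendered participant, HxW (non-zero = foreground)
import numpy as np

def mask_replace(real_img: np.ndarray, sim_img: np.ndarray, sim_mask: np.ndarray) -> np.ndarray:
    fused = real_img.copy()
    fg = sim_mask.astype(bool)
    # Pixels covered by the simulated participant take the rendered values;
    # all other pixels keep the real background.
    fused[fg] = sim_img[fg]
    return fused
```

A per-frame call to such a composite, paired with the corresponding ray replacement in the point cloud, would yield the synchronized virtual-real fused image and LiDAR data described in the abstract.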

Key words: intelligent transportation, intelligent driving, scenario testing, fusion of virtual and real data, perception data simulation