The number of Advanced Driver Assistance Systems (ADAS) in future vehicle generations will increase steadily in order to support drivers with comfort, safety and ecology functions. As the number of ADAS functions grows, so does the challenge for developers to prove the safety and reliability of the overall system. The risk to people and test equipment in potentially dangerous real-world test scenarios, together with the great effort required to achieve reproducible results in real driving tests, makes an alternative test method necessary.
Therefore, Audi is working together with partners on the development of "Virtual Test Drive" (VTD) [VIR01], a modular, computer-based system for the integrated simulation of a virtual vehicle in a virtual environment. VTD supports engineers throughout the development, testing and validation process of ADAS. It contains reusable components, interfaces, models and tools which can be shared across different simulation variants (Software-, Hardware-, Model-, Driver- and Vehicle-in-the-loop) and applied at different stages of the development and testing process. The VTD simulation environment enables realistic closed-loop simulations for analyzing the interaction between simulation components such as sensor systems, actuators and a model of the vehicle environment, as well as the assistance or safety functions under test.
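To make the closed-loop idea concrete, the following minimal sketch shows one possible simulation step in which an environment model, a perception sensor model, the function under test and a vehicle model are connected in a loop. All class and function names, as well as the trivial braking logic, are hypothetical placeholders chosen for illustration and do not correspond to the actual VTD interfaces or models.

```python
# Minimal closed-loop sketch: environment model -> sensor model ->
# ADAS function under test -> vehicle model, repeated per time step.
# All names and models are hypothetical, not the actual VTD interfaces.

from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float      # position along the road [m]
    v: float      # longitudinal speed [m/s]

def environment_model(ego: VehicleState) -> dict:
    """Ground-truth objects around the ego-vehicle (here: one static target)."""
    return {"target_distance": 80.0 - ego.x}

def sensor_model(ground_truth: dict) -> dict:
    """Idealized perception sensor: passes the ground truth through unchanged."""
    return {"measured_distance": ground_truth["target_distance"]}

def adas_function(measurement: dict) -> float:
    """Function under test: simple braking logic based on the sensed distance."""
    return -4.0 if measurement["measured_distance"] < 30.0 else 0.0

def vehicle_model(ego: VehicleState, accel: float, dt: float) -> VehicleState:
    """Point-mass longitudinal vehicle dynamics."""
    v = max(0.0, ego.v + accel * dt)
    return VehicleState(x=ego.x + v * dt, v=v)

ego, dt = VehicleState(x=0.0, v=20.0), 0.05
for step in range(400):                      # 20 s of simulated time
    gt = environment_model(ego)
    meas = sensor_model(gt)
    accel = adas_function(meas)
    ego = vehicle_model(ego, accel, dt)
```

In a real setup, each of these blocks would be replaced by the corresponding VTD component or by hardware, software or a human driver in the loop, depending on the simulation variant.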
In particular, this paper presents a method for the analysis and validation of perception sensor models in VTD that generate synthetic sensor data (e.g. video camera, RADAR, LIDAR). The simulated perception sensor data is compared to real sensor data in a number of selected scenarios.
The process of generating synthetic sensor data with VTD using perception sensor models starts with the recording of a real vehicle test drive in a real-world test scenario. GPS trajectory coordinates as well as vehicle state data and perception sensor data are recorded during defined approach and collision scenarios between the ego-vehicle and target objects. In a second step, this data is imported into VTD and synthetic sensor data is generated by feeding the recorded trajectory and vehicle state data through the VTD sensor models. In a final step, the synthetic sensor data is converted to the same format as the recorded real sensor data. The aim of this conversion step is to evaluate and validate the synthetic data using the same toolchain as for the real sensor data.
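The following sketch illustrates this three-step workflow in strongly simplified form. The file layout, the field names and the distance-only sensor model are assumptions made for illustration only and do not reflect the actual VTD recording formats, sensor models or evaluation toolchain.

```python
# Hypothetical sketch of the resimulation pipeline described above:
# (1) load a recorded real test drive, (2) replay trajectory and vehicle
# state through a perception sensor model to obtain synthetic data,
# (3) convert the synthetic data into the format of the real recording so
# that the same evaluation toolchain can be applied. All names are assumed.

import csv

def load_recording(path: str) -> list[dict]:
    """Step 1: read recorded GPS trajectory, vehicle state and sensor data."""
    with open(path, newline="") as f:
        return [dict(row) for row in csv.DictReader(f)]

def sensor_model(ego_x: float, target_x: float) -> float:
    """Step 2: toy perception sensor model, returns measured distance to target."""
    return target_x - ego_x

def resimulate(recording: list[dict]) -> list[dict]:
    """Feed the recorded trajectory and vehicle state through the sensor model."""
    synthetic = []
    for sample in recording:
        dist = sensor_model(float(sample["ego_x"]), float(sample["target_x"]))
        synthetic.append({"t": sample["t"], "distance": dist})
    return synthetic

def convert_to_real_format(synthetic: list[dict], path: str) -> None:
    """Step 3: write synthetic data in the same format as the real recording."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["t", "distance"])
        writer.writeheader()
        writer.writerows(synthetic)

# Usage (assuming a CSV recording with columns t, ego_x, target_x):
# recording = load_recording("real_drive.csv")
# convert_to_real_format(resimulate(recording), "synthetic_drive.csv")
```

Because the synthetic output is written in the same format as the real recording, both can subsequently be processed and compared with identical evaluation tools.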
The novelty of the method presented in this paper lies in its reusability across different sensor models, functions and test scenarios, and in the high degree of automation that can be achieved.