MSC Software Corporation (MSC), part of Hexagon's Manufacturing Intelligence division, has announced Adams-ready VTD, combining vehicle dynamics and virtual test drive simulation to accelerate the development of Advanced Driver Assistance Systems (ADAS) and safe autonomous vehicles.
Passenger vehicles can already read traffic signs or detect passing traffic, but these Level 2+ ADAS functions depend on improved sensor fusion – the process of merging data from multiple sensors into a near-certain picture of the surroundings so that electronic systems can make safe decisions. At the same time, future autonomous driving algorithms require realistic test data for research and model training. Launched today, Adams-ready VTD accelerates development by simulating how a dynamic moving vehicle and its sensors behave in complex road environments.
Automotive manufacturers already use Adams simulation software, validated against road tests, to understand a vehicle's movements and handling. Through an open interface, these vehicle dynamics models can now be 'driven' in a simulated road environment provided by the Virtual Test Drive (VTD) platform.
ADAS systems must protect people, even in extreme circumstances at the physical limits of the vehicle. Adams-ready VTD simulates a vehicle's movements based on the road conditions (e.g. slope, friction) to determine the vehicle behaviour (e.g. will the car skid or roll?) and evaluate the best course of action (e.g. whether to change lane, or how much to brake).
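As a rough illustration of the kind of check such a simulation supports, the Python sketch below uses a simple point-mass model to test whether a braking demand stays within the friction limit on a downhill slope. The function names and values are illustrative only; Adams itself works with far more detailed multibody vehicle models.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_braking_decel(mu: float, slope_rad: float) -> float:
    """Friction-limited deceleration on a downhill slope (point-mass model).

    The tyres can transmit at most mu * m * g * cos(theta) of braking force,
    while gravity contributes m * g * sin(theta) pushing the vehicle downhill.
    """
    return G * (mu * math.cos(slope_rad) - math.sin(slope_rad))

def can_stop(speed_mps: float, distance_m: float, mu: float, slope_rad: float) -> bool:
    """True if the vehicle can stop within distance_m without exceeding the friction limit."""
    required = speed_mps ** 2 / (2.0 * distance_m)   # constant-deceleration estimate
    available = max_braking_decel(mu, slope_rad)
    return available > 0 and required <= available

# Illustrative case: 90 km/h on a wet downhill road (mu ~ 0.5, 6 % grade), obstacle 70 m ahead
print(can_stop(25.0, 70.0, mu=0.5, slope_rad=math.atan(0.06)))  # -> False: the car would skid or overrun
```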
Luca Castignani, Automotive Strategist, MSC Software commented: "Simulation must be accurate to centimetres, not metres, because a split second makes the difference in the most complex of circumstances. With Adams-ready VTD, we have brought software development and automotive engineering together so the industry can move from 'what should the vehicle do?' to 'can the vehicle cope with this command?' and develop the next generation of safe vehicles."
ADAS systems rely on accurate information from cameras, RADAR/LiDAR or satellite navigation to make safety-critical decisions. Now the blind spots caused by vehicle-road dynamics can be identified to determine which sensors to rely upon, and when. For example, ensuring that a car driving over a speed bump can still perceive a pedestrian, even if camera vibrations prevent tracking.
Vehicle Original Equipment Manufacturers (OEMs) can evaluate how sensors function when subjected to vibrations or changes in orientation, so they can cost-effectively develop sensor fusion between road tests. Castignani explained: "The perception of a camera mounted on a truck cabin can change significantly relative to radar measurements during a braking manoeuvre – so what is the proximity to the car in front? We are enabling ADAS engineers to develop robust test cases like this to improve confidence in the decisions they make and develop accurate sensor fusion."
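One common way to combine such disagreeing range estimates is inverse-variance weighting, shown in the minimal Python sketch below. This is a generic fusion technique, not MSC's implementation, and the measurement values are made up for illustration.

```python
def fuse_ranges(measurements, variances):
    """Inverse-variance weighted fusion of independent range estimates (metres)."""
    weights = [1.0 / v for v in variances]
    fused = sum(w * m for w, m in zip(weights, measurements)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance

# Hypothetical braking manoeuvre: cab pitch degrades the camera's range estimate,
# so its variance is inflated before fusing with the chassis-mounted radar.
camera_range, camera_var = 31.5, 4.0   # metres; affected by pitch and vibration
radar_range, radar_var = 28.8, 0.5     # metres; largely unaffected

fused, var = fuse_ranges([camera_range, radar_range], [camera_var, radar_var])
print(f"fused range: {fused:.1f} m (variance {var:.2f})")  # leans towards the radar
```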
Adams models can now be used directly in VTD 2019.1 via the open Functional Mock-up Interface (FMI), with flexible configuration to simulate any vehicle, including trucks with more than four wheels and trailers. VTD guarantees synchronisation with Adams for robust simulation in real time, or faster. Companies can now "bring their own AI", using an open interface to insert their driver-in-the-loop into VTD, then test and train their self-driving algorithms in a more accurate simulation with richer data.
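As a generic illustration of how a model packaged behind the FMI standard can be exercised, the sketch below uses the open-source FMPy library from Python; this is not the VTD tool chain itself, and the FMU path and variable names ('road_mu', 'chassis_pitch') are hypothetical.

```python
# Requires: pip install fmpy
from fmpy import read_model_description, simulate_fmu

FMU_PATH = "vehicle_dynamics.fmu"  # hypothetical FMU exported from a vehicle dynamics model

# Inspect the interface the FMU exposes (inputs, outputs, parameters).
description = read_model_description(FMU_PATH)
for variable in description.modelVariables:
    print(variable.causality, variable.name)

# Run a 10 s simulation, overriding a parameter and recording one output.
result = simulate_fmu(
    FMU_PATH,
    stop_time=10.0,
    start_values={"road_mu": 0.5},   # illustrative parameter name
    output=["chassis_pitch"],        # illustrative output name
)
print(result["time"][-1], result["chassis_pitch"][-1])
```

In the Adams-ready VTD coupling, the co-simulation master keeps the two tools in step, so the same standardised interface carries vehicle dynamics outputs into the virtual road environment at each simulation step.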
VTD 2019.1 supports the OpenDRIVE 1.5 and OpenSCENARIO 0.9 interoperability standards and features enhanced LiDAR simulation with more accurate GPU-accelerated ray-tracing and capabilities to simulate surface interaction. It is available for Red Hat Linux 7.3, with optional Docker containerisation of modules to aid the integration of VTD into customers' virtual test environments and to simplify deployment to cloud or on-premises infrastructure.