Jeff's MCAD Blogging
Jeffrey Rowe has more than 40 years of experience in all aspects of industrial design, mechanical engineering, and manufacturing. On the publishing side, he has written well over 1,000 articles for CAD, CAM, CAE, and other technical publications, as well as consulting in many capacities in the industry.
Siemens Making Sense of Autonomous Vehicle Sensors Through Digital Simulation – Part 2
April 5th, 2018 by Jeff Rowe
Editor’s Note: This is the second part of a two-part article on Siemens’ simulation efforts aimed at making autonomous vehicles safer while making verification and validation processes more comprehensive and efficient.
Last week, Siemens introduced, as part of its Simcenter portfolio, a solution for the development of autonomous driving systems that minimizes the need for extensive physical prototyping while dramatically reducing the number of logged test miles necessary to demonstrate the safety of autonomous vehicles.
This computing and simulation platform is aimed at accelerating the validation and verification of autonomous cars.
The new solution integrates autonomous driving technologies from recent Siemens acquisitions Mentor Graphics and TASS International. TASS’ PreScan simulation environment produces highly realistic, physics-based simulated raw sensor data for an unlimited number of potential driving scenarios, traffic situations and other parameters.
The data from PreScan’s simulated LiDAR, radar and camera sensors is then fed into Mentor’s DRS360 platform, where it is fused in real time to create a high-resolution model of the vehicle’s environment and driving conditions. Customers can then leverage the DRS360 platform’s superior perception resolution and high-performance processing to test and refine proprietary algorithms for critical tasks such as object recognition, driving policy and more.
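The PreScan-to-DRS360 handoff described above can be illustrated with a minimal, self-contained sketch. This is not the actual Siemens/Mentor API; the function names, noise model, and simple averaging fusion are all invented for illustration, standing in for physics-based sensor simulation and real-time raw-data fusion.

```python
import math

# Hypothetical stand-in for PreScan-style simulated raw sensor output:
# each "sensor" reports (range m, bearing rad) detections of the same
# two obstacles, with a sensor-specific bias. Values are illustrative.
def simulated_detections(bias):
    obstacles = [(20.0, 0.10), (35.0, -0.25)]
    return [(r + bias, b) for r, b in obstacles]

def fuse(detections_per_sensor):
    """Average matched detections across sensors (toy centralized fusion)."""
    fused = []
    for dets in zip(*detections_per_sensor):
        n = len(dets)
        fused.append((sum(d[0] for d in dets) / n,
                      sum(d[1] for d in dets) / n))
    return fused

lidar = simulated_detections(+0.05)
radar = simulated_detections(-0.05)
camera = simulated_detections(0.00)

# Fusing all three streams cancels the individual sensor biases.
env_model = fuse([lidar, radar, camera])
for rng, brg in env_model:
    x, y = rng * math.cos(brg), rng * math.sin(brg)
    print(f"obstacle at x={x:5.1f} m, y={y:5.1f} m")
```

A perception algorithm under test would then consume `env_model` instead of real road data, which is the point of the virtual pipeline.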
TASS International Acquisition
Last summer, Siemens acquired TASS International, a provider of simulation software, plus engineering and test services aimed primarily at the automotive industry, and focused on autonomous driving, integrated safety, advanced driver assistance systems (ADAS), and tire modeling. The company developed a family of solutions that strengthen Siemens’ PLM software portfolio, and add to its position as a leading supplier of “systems driven product development” offerings for the automotive industry.
The video below shows testing of a complete vehicle in a controlled hardware-in-the-loop environment for validating Automatic Emergency Braking (AEB) systems.
TASS Vehicle Hardware-in-the-Loop AEB Testing
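The closed-loop logic being validated in such a test can be illustrated with a minimal software-in-the-loop analogue: an ego vehicle approaches a stopped obstacle, and a simple controller commands full braking once the time-to-collision drops below a threshold. All parameters here are invented for illustration, not TASS/PreScan values.

```python
# Toy AEB scenario: coast toward a stopped obstacle, brake hard when
# time-to-collision (TTC) falls below a threshold, report the margin.
def run_aeb_scenario(gap_m=60.0, speed_mps=20.0, ttc_brake_s=2.0,
                     decel_mps2=8.0, dt=0.01):
    while speed_mps > 0.0 and gap_m > 0.0:
        ttc = gap_m / speed_mps
        if ttc < ttc_brake_s:                        # AEB trigger
            speed_mps = max(0.0, speed_mps - decel_mps2 * dt)
        gap_m -= speed_mps * dt                      # advance one step
    return gap_m  # remaining gap; > 0 means the collision was avoided

margin = run_aeb_scenario()
print(f"stopped {margin:.1f} m short of the obstacle")
```

In a real HIL setup the braking command would drive actual vehicle hardware while the obstacle and sensor returns remain simulated; the loop structure is the same.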
TASS International is focused on automated driving solutions and integrated (active, passive) safety, primarily for the automotive industry. With its PreScan software, car manufacturers, suppliers and government agencies can simulate complex traffic scenarios and virtually validate automated driving and advanced driver assistance systems.
TASS International’s simulation software is combined with Siemens’ Simcenter portfolio of advanced simulation offerings, and its electronic design automation (EDA) solutions from the recently acquired Mentor Graphics. The combination provides an integrated solution to frontload the verification and validation of ADAS and autonomous driving systems.
Siemens acquired 100 percent of the share capital of TASS International and integrated the business into its PLM Software business unit, which is part of its Digital Factory Division.
TASS PreScan Overview
TASS PreScan is a physics-based simulation platform that is used in the automotive industry for developing Advanced Driver Assistance Systems (ADAS) that are based on sensor technologies such as radar, laser/LiDAR, camera and GPS. PreScan is also used for designing and evaluating vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication applications as well as autonomous driving applications.
The PreScan platform consists of a GUI-based preprocessor for defining scenarios and a run-time environment for executing them. The engineer's primary interface for creating and testing algorithms is MATLAB/Simulink. PreScan can be used from model-based controller design (model-in-the-loop, MIL) through real-time tests with software-in-the-loop (SIL) and hardware-in-the-loop (HIL) systems, and can operate in open-loop or closed-loop, and in offline or online, modes. It is an open software platform with flexible interfaces for linking to third-party vehicle dynamics models (such as CarSim and dSPACE ASM) and third-party HIL simulators/hardware (such as ETAS, dSPACE, and Vector).
The platform works in four steps:

1. Build scenario. A dedicated pre-processor (GUI) allows users to build and modify traffic scenarios within minutes using a database of road sections, infrastructure components (trees, buildings, traffic signs), actors (cars, trucks, bikes, and pedestrians), weather conditions (such as rain, snow, and fog), and light sources (such as the sun, headlights, and lampposts). Representations of real roads can be made quickly by reading in information from OpenStreetMap, Google Earth, Google 3D Warehouse, and/or a GPS navigation device.

2. Model sensors. Vehicle models can be equipped with different sensor types, including radar, laser, camera, ultrasonic, infrared, GPS, and antennas for vehicle-to-X (V2X) communication. Sensor design and benchmarking are facilitated by easy exchange and modification of sensor types and characteristics.

3. Add control system. A MATLAB/Simulink interface enables users to design and verify algorithms for data processing, sensor fusion, decision making, and control, as well as to re-use existing Simulink models such as vehicle dynamics models from CarSim, DYNA4, or ASM.

4. Run experiment. A 3D visualization viewer allows users to analyze the results of the experiment, providing multiple viewpoints, intuitive navigation controls, and picture and movie generation capabilities. Interfaces with ControlDesk and LabVIEW can also be used to automatically run a batch of scenarios, as well as to run hardware-in-the-loop simulations.
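The kind of scripted batch execution described above can be sketched in a few lines. Here `run_scenario` is a hypothetical stand-in for launching one simulation run, and the pass/fail rule (a stopping-distance budget, with wet weather lengthening stopping distance by 30%) is invented purely for illustration.

```python
import itertools

# Hypothetical stand-in for launching one simulation experiment;
# a real deployment would call the tool's own batch interface.
def run_scenario(weather, speed_mps):
    stopping_m = speed_mps ** 2 / (2 * 8.0)   # v^2 / (2a), a = 8 m/s^2
    if weather == "rain":
        stopping_m *= 1.3                     # assumed wet-road penalty
    return {"weather": weather, "speed": speed_mps,
            "passed": stopping_m < 40.0}      # illustrative budget

# Sweep the scenario matrix: every weather x speed combination.
weathers = ["clear", "rain", "fog"]
speeds = [15.0, 20.0, 25.0]
results = [run_scenario(w, v)
           for w, v in itertools.product(weathers, speeds)]

failures = [r for r in results if not r["passed"]]
print(f"{len(results)} scenarios run, {len(failures)} failed")
```

Sweeping a parameterized scenario matrix like this, rather than hand-running cases, is what makes virtual validation scale to thousands of situations.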
Mentor’s Open Platform for ADAS And Autonomous Vehicles
To address the hardware side, about a year ago, Mentor, an acquired Siemens business, introduced the DRS360 platform – a comprehensive automated driving solution featuring breakthrough technology that captures, fuses and utilizes raw data in real time from a wide range of sensing modalities, including radar, LiDAR, vision and other sensors.
The DRS360 platform delivers dramatic improvements in latency reduction, sensing accuracy and overall system efficiency required for SAE Level 5 autonomous vehicles (fully automated).
For autonomous driving platforms, DRS360 directly transmits unfiltered information from all system sensors to a central processing unit, where the raw sensor data is fused in real time at all levels. Developed in partnership with leading sensor suppliers, the platform employs innovative “raw data sensors” that are unburdened by the power, cost, and size penalties of microcontrollers and related processing in the sensor nodes.
DRS360 combines Xilinx Zynq UltraScale+ MPSoC FPGAs with advanced neural networking algorithms for machine learning and a host of integration services, built on a system support package utilizing Mentor IP. The platform's “raw data” approach eliminates pre-processing at all sensor nodes. The results are reduced cost and complexity for OEMs, suppliers, and autonomous driving developers working on systems that can see and act at high resolution.
The platform accommodates a wide array of sensors from leading suppliers, and customer choice extends to the use of x86- and Arm-based systems on chips (SoCs) for delivering key autonomous driving functionality such as sensor fusion and event detection, semantic perception of objects, applications such as situational awareness and path planning, and actuator control.
Eliminating pre-processing microcontrollers from all system sensor nodes enables a broad array of advantages, including real-time performance, significant reductions in system cost and complexity, and access to all captured sensor data for the highest resolution model of a vehicle’s environment and driving conditions.
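A toy numeric example (not Mentor's algorithm) shows why keeping the raw data matters: a weak target that falls below each sensor's individual detection threshold is discarded when nodes pre-filter, but survives when the raw readings are fused centrally before thresholding. All intensities and the threshold are invented for illustration.

```python
THRESHOLD = 1.0

# Per-cell return intensities from two sensors (illustrative values);
# cell 0 holds a weak target visible to neither sensor on its own.
radar_raw = [0.6, 2.5, 0.2]
lidar_raw = [0.7, 2.4, 0.1]

# Edge pre-processing: each node thresholds before transmitting,
# so the weak target in cell 0 is dropped at the sensor.
edge_detections = [i for i in range(3)
                   if radar_raw[i] > THRESHOLD or lidar_raw[i] > THRESHOLD]

# Centralized raw fusion: sum raw intensities first, threshold once.
fused = [r + l for r, l in zip(radar_raw, lidar_raw)]
central_detections = [i for i, v in enumerate(fused) if v > THRESHOLD]

print("edge-filtered detections: ", edge_detections)
print("raw centralized detections:", central_detections)
```

The centralized path recovers cell 0 because the two sub-threshold returns reinforce each other, which is the intuition behind fusing unfiltered data "at all levels."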
The platform’s streamlined data transport architecture further lowers system latency by minimizing physical bus structures, hardware interfaces and complex, time-triggered Ethernet backbones. This architecture also enables situation-adaptive redundancy and dynamic resolution by using centralized, unfiltered sensor data to ensure enhanced accuracy and reliability.
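The redundancy idea can also be sketched in miniature. This is not DRS360's internal logic; it is a generic cross-checking scheme, with the tolerance and readings invented for illustration, showing how centrally available readings from multiple sensors let a faulted value be out-voted rather than propagate.

```python
import statistics

# Generic cross-check: trust only readings near the median, fuse those,
# and report how many readings were rejected as outliers.
def cross_checked_range(readings, tolerance=1.0):
    med = statistics.median(readings)
    trusted = [r for r in readings if abs(r - med) <= tolerance]
    return sum(trusted) / len(trusted), len(readings) - len(trusted)

# Three sensors range the same object; one has faulted (illustrative).
rng, rejected = cross_checked_range([24.9, 25.1, 40.0])
print(f"fused range {rng:.1f} m, {rejected} reading(s) rejected")
```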
The solution’s optimized signal processing software, advanced algorithms, and compute-optimized neural networks for machine learning run on a seamlessly integrated, automotive-grade platform.
It’s no secret that robotic simulations have been around for quite some time. However, they often simplify the simulated environment and ignore many aspects critical to automotive simulation, such as the infrared and radar responses of objects to the sensors now becoming common in autonomous vehicles. Likewise, the resolution and quality of feedback within these systems are often insufficient to match the requirements of these vehicles.
The Siemens simulation platform is at the forefront of addressing and ultimately resolving these concerns. The Siemens platform seems poised to safely propel autonomous vehicles into the future.