 MCADCafe Editorial

Posts Tagged ‘autonomous vehicles’

Siemens Making Sense of Autonomous Vehicle Sensors Through Digital Simulation – Part 2

Thursday, April 5th, 2018

Editor’s Note: This is the second part of a two-part article on Siemens’ simulation efforts aimed at making autonomous vehicles safer while making verification and validation processes more comprehensive and efficient.

Last week, Siemens introduced a solution for the development of autonomous driving systems as part of its Simcenter portfolio that minimizes the need for extensive physical prototyping while dramatically reducing the number of logged test miles necessary to demonstrate the safety of autonomous vehicles.

This computing and simulation platform is aimed at accelerating the validation and verification of autonomous cars.

The new solution integrates autonomous driving technologies from recent Siemens acquisitions Mentor Graphics and TASS International. TASS’ PreScan simulation environment produces highly realistic, physics-based simulated raw sensor data for an unlimited number of potential driving scenarios, traffic situations and other parameters.

The data from PreScan’s simulated LiDAR, radar and camera sensors is then fed into Mentor’s DRS360 platform, where it is fused in real time to create a high-resolution model of the vehicle’s environment and driving conditions. Customers can then leverage the DRS360 platform’s superior perception resolution and high-performance processing to test and refine proprietary algorithms for critical tasks such as object recognition, driving policy and more.
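
To make that flow concrete, here is a minimal, hypothetical Python sketch of a simulation-to-fusion test loop of this general shape. The class and function names are illustrative stand-ins, not the actual PreScan or DRS360 interfaces.

```python
from dataclasses import dataclass

# Hypothetical sketch of a simulation-driven sensor-fusion test loop.
# Names and interfaces are illustrative only; they are not the real
# PreScan or DRS360 APIs.

@dataclass
class SensorFrame:
    timestamp: float
    lidar_points: list    # simulated LiDAR returns
    radar_tracks: list    # simulated radar detections
    camera_image: bytes   # simulated camera frame

def simulate_scenario(num_frames: int):
    """Stand-in for a physics-based simulator emitting raw sensor data."""
    for i in range(num_frames):
        yield SensorFrame(timestamp=i * 0.05, lidar_points=[],
                          radar_tracks=[], camera_image=b"")

def fuse(frame: SensorFrame) -> dict:
    """Stand-in for real-time raw-data fusion into an environment model."""
    return {"t": frame.timestamp, "objects": []}

def perception_under_test(env_model: dict) -> list:
    """Customer algorithm being validated, e.g. object recognition."""
    return env_model["objects"]

for frame in simulate_scenario(num_frames=200):
    env = fuse(frame)
    detections = perception_under_test(env)
    # In a real test harness, compare detections against the simulator's
    # ground truth for each scenario variant.
```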

 

TASS International Acquisition

Last summer, Siemens acquired TASS International, a provider of simulation software plus engineering and test services aimed primarily at the automotive industry, focused on autonomous driving, integrated safety, advanced driver assistance systems (ADAS) and tire modeling. The company developed a family of solutions that strengthens Siemens’ PLM software portfolio and adds to its position as a leading supplier of “systems-driven product development” offerings for the automotive industry.

The video below shows testing of a complete vehicle in a controlled hardware-in-the-loop environment for validating Automatic Emergency Braking (AEB) systems.

 

TASS Vehicle Hardware-in-the-Loop AEB Testing

TASS International is focused on automated driving solutions and integrated (active, passive) safety, primarily for the automotive industry. With its PreScan software, car manufacturers, suppliers and government agencies can simulate complex traffic scenarios and virtually validate automated driving and advanced driver assistance systems.
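
As a rough illustration of the kind of decision logic such a hardware-in-the-loop rig exercises, the sketch below implements a bare-bones AEB trigger based on time-to-collision. The 1.5-second threshold and the constant-closing-speed model are assumptions chosen for illustration, not TASS or PreScan parameters.

```python
def aeb_should_brake(range_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 1.5) -> bool:
    """Return True when estimated time-to-collision drops below the threshold.

    Illustrative only: the threshold and the constant-closing-speed
    time-to-collision model are assumptions, not production AEB logic.
    """
    if closing_speed_mps <= 0:       # target is not closing; no intervention
        return False
    time_to_collision = range_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# Example: stationary obstacle 20 m ahead, ego vehicle at 50 km/h (~13.9 m/s)
print(aeb_should_brake(range_m=20.0, closing_speed_mps=50 / 3.6))  # True (TTC ~1.44 s)
```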


Siemens Making Sense of Autonomous Vehicle Sensors Through Digital Simulation – Part 1

Thursday, March 29th, 2018

Although they hold much promise, self-driving/autonomous vehicles have not had a stellar run lately. In fact, recent events have cast a dark cloud over them.

Testing them on the road is, of course, essential, but I’ve often wondered whether digital simulation could be used more extensively to maximize safety and efficiency while reducing the amount of road testing required.

As it turns out, simulation is now being applied quite extensively for exactly that purpose.

This week, Siemens introduced a breakthrough solution for the development of autonomous driving systems as an addition to its Simcenter portfolio, one that minimizes the need for extensive physical prototyping while dramatically reducing the number of logged test miles necessary to demonstrate the safety of autonomous vehicles.

In a nutshell, this computing and simulation platform is aimed at accelerating the validation and verification of autonomous cars.

Siemens PLM Software Driving Simulator

According to the findings of a report issued by the RAND Corporation, autonomous vehicle prototypes would have to be driven hundreds of millions of miles, and in some cases hundreds of billions of miles, over the course of several decades to demonstrate their reliability in terms of fatalities and injuries – an outcome the authors deemed inconsistent with the near-term commercial viability of self-driving cars. For possible solutions to this challenge, the researchers pointed to innovative testing methods such as advanced simulation technologies.
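
The arithmetic behind that conclusion is easy to sketch. Assuming the roughly 1.09 fatalities per 100 million miles attributed to human drivers at the time, and asking for 95% confidence with zero observed fatalities, a simple Poisson-style calculation already lands in the hundreds of millions of miles; demonstrating a statistically significant improvement over human drivers, rather than mere parity, pushes the figure far higher. The rate and confidence level below are assumptions for illustration, not figures quoted verbatim from the report.

```python
import math

# Back-of-envelope version of the "how many miles" argument. The fatality
# rate and confidence level are illustrative assumptions, not figures
# quoted from the RAND report.
human_fatality_rate = 1.09 / 100e6   # fatalities per mile (approx. U.S. human-driver rate)
confidence = 0.95

# With zero fatalities observed over N miles, a Poisson model lets us claim
# the true rate is below human_fatality_rate with the given confidence once
# N >= -ln(1 - confidence) / rate (the "rule of three" at 95% confidence).
miles_needed = -math.log(1 - confidence) / human_fatality_rate
print(f"~{miles_needed / 1e6:.0f} million failure-free miles")   # ~275 million
```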

Leveraging advanced, physics-based simulation and innovative sensor data processing technologies, the new Siemens solution is designed to help automakers and their suppliers address this industry challenge with the potential to shave years off the development, verification and validation of self-driving cars.


NVIDIA’s AI Computer Drives AVs

Thursday, October 19th, 2017

This week NVIDIA unveiled what it claims to be the world’s first artificial intelligence computer designed specifically to “drive” fully autonomous vehicles.

The new system, codenamed Pegasus, extends the NVIDIA® DRIVE™ PX AI computing platform to handle Level 5 driverless vehicles (Level 5 is “steering wheel optional”; in other words, no human intervention is required, as in a robotic taxi). NVIDIA DRIVE PX Pegasus can perform over 320 trillion operations per second — more than 10x the performance of its predecessor, NVIDIA DRIVE PX 2.

NVIDIA DRIVE PX Pegasus is intended to help make possible a new class of vehicles that can operate without a driver — fully autonomous vehicles without steering wheels, pedals or mirrors, and with interiors that feel more like a living room or office than a vehicle. They will arrive on demand to safely take passengers to their destinations, bringing mobility to everyone, including the elderly and disabled.

One of the driving forces behind autonomous vehicles is to recapture millions of hours of lost time that could be used by “drivers” (really passengers) to work, play, eat or sleep on their daily commutes. Theoretically, countless lives could be saved by vehicles that are never fatigued, impaired, or distracted — increasing road safety, reducing congestion, and possibly freeing up land currently used for parking lots.

Of the 225 partners developing on the NVIDIA DRIVE PX platform, more than 25 are developing fully autonomous robotaxis using NVIDIA CUDA GPUs. Today, their trunks resemble small data centers, loaded with racks of computers with server-class NVIDIA GPUs running deep learning, computer vision and parallel computing algorithms. Their size, power demands and cost make them impractical for production vehicles.

NVIDIA AI Vehicle Demonstration

The computational requirements of robotaxis are enormous: perceiving the world through high-resolution, 360-degree surround cameras and lidars, localizing the vehicle to within centimeters, tracking vehicles and people around the car, and planning a safe and comfortable path to the destination. All of this processing must be done with multiple levels of redundancy to ensure the highest level of safety. The computing demands of driverless vehicles are easily 50 to 100 times greater than those of today’s most advanced human-driven cars.
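
A quick back-of-envelope calculation hints at why. Using assumed figures for a surround camera and lidar suite (none of them NVIDIA specifications), the raw sensor stream alone runs to roughly a dozen gigabits per second before any fusion, tracking or planning takes place.

```python
# Rough, purely illustrative estimate of raw sensor bandwidth for a
# robotaxi sensor suite; every parameter below is an assumption, not an
# NVIDIA specification.
cameras = 8                       # 360-degree surround coverage
cam_resolution = 1920 * 1080      # pixels per frame
cam_bits_per_pixel = 24
cam_fps = 30

lidars = 2
lidar_points_per_sec = 1_300_000  # points per second per unit
lidar_bits_per_point = 48         # x, y, z, intensity

camera_bps = cameras * cam_resolution * cam_bits_per_pixel * cam_fps
lidar_bps = lidars * lidar_points_per_sec * lidar_bits_per_point

total_gbps = (camera_bps + lidar_bps) / 1e9
print(f"~{total_gbps:.1f} Gbit/s of raw sensor data to fuse in real time")
```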




