Five challenges of AV design: from sensors and cameras to power and ADAS

What will it take for autonomous cars to go mainstream?

BY TIM GRAI
Director of ADAS and Autonomous Systems, Renesas Electronics America
www.renesas.com

How do we move from the car on the left to the car on the right? (Image: Renesas Electronics)

There has been tremendous innovation in automotive technologies, from the LiDAR, radar, cameras, and other sensors that help a vehicle "see" its driving environment to the CPUs and SoCs that bring intelligence to the flood of data generated in an autonomous vehicle. The automotive industry has made great strides in developing autonomous vehicle prototypes, from consumer to commercial vehicles. The big question is: What will it take for autonomous cars to go mainstream?

Test vehicles are a stepping stone to production readiness, but the test vehicles on the road today do not translate directly into autonomous cars ready to purchase from the dealer. A test vehicle is an environment that engineers can work in and learn from: it provides a feedback loop in which they can continuously improve their systems and try out new ideas in a real-world setting. With test vehicles, things may not work all the time, and engineering teams learn from those results and improve on the way to production readiness.

When it comes to production-ready autonomous cars, the standard is higher. The vehicle must function continuously and safely at all times. This means that vehicles must not only be fail-safe — they must be fail-operational. And that is a great engineering challenge. Here are five questions facing automotive engineers as they navigate the path to production-ready autonomous vehicles.

How do data and testing requirements change for L3–L5, eyes-off functionality?
The standards for Level 3 to Level 5 (L3–L5) vehicles are rigorous. At these levels, the human is taken out of the equation; it is assumed that there is no one there as backup. ISO 26262 and its Automotive Safety Integrity Level (ASIL) classifications demand new capabilities and performance to meet the increased safety requirements. Today, the state of the art is functional safety, with vehicle systems designed to fail safely. Going forward, L3–L5 vehicle systems must be able to fail operationally; that is, they need to guarantee full or degraded operation of their functions even when a failure occurs.
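
To make the fail-safe versus fail-operational distinction concrete, here is a minimal C++ sketch of a supervisor that degrades function rather than simply shutting down. It is illustrative only, not Renesas production code; the states, health flags, and arbitration policy are assumptions.

```cpp
#include <cstdio>

enum class DriveMode { Full, Degraded, SafeStop };

struct SystemHealth {
    bool primary_compute_ok;
    bool backup_compute_ok;
    bool sensors_ok;
};

// A fail-safe design maps any failure to a shutdown; a fail-operational
// design keeps a reduced function alive (e.g., limping to a safe stop).
DriveMode arbitrate(const SystemHealth& h) {
    if (h.primary_compute_ok && h.sensors_ok) return DriveMode::Full;
    if (h.backup_compute_ok && h.sensors_ok)  return DriveMode::Degraded;
    return DriveMode::SafeStop;  // last resort: execute a controlled stop
}

int main() {
    SystemHealth h{false, true, true};  // primary compute lost, backup healthy
    std::printf("mode = %d\n", static_cast<int>(arbitrate(h)));  // 1 = Degraded
}
```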

As such, systems must be tested against all the "what-if" scenarios. That means covering many scenarios with a combination of simulation and vehicle-level testing, from emergency braking that must be ready at all times to vehicle controls that can navigate to the closest safe stopping point if the vehicle is disabled (a simple table-driven harness is sketched below). We also need to determine how much data collection is enough to train these systems; the data we collect helps cover the "what-if" scenarios.
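
As a rough illustration of scenario-level testing, the following sketch shows a table-driven "what-if" runner. The scenario names and pass criteria are invented for illustration; a real harness would drive a simulator or a vehicle-in-the-loop rig rather than returning constants.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct Scenario {
    std::string name;
    std::function<bool()> run;  // returns true if the vehicle behaved safely
};

int main() {
    // Each entry stubs out one "what-if" case.
    std::vector<Scenario> scenarios = {
        {"AEB: pedestrian steps out at 40 km/h", [] { return true; }},
        {"Disabled vehicle: navigate to closest safe stop", [] { return true; }},
        {"Sensor dropout: front radar lost in fog", [] { return false; }},
    };
    int failures = 0;
    for (const auto& s : scenarios) {
        const bool ok = s.run();
        std::printf("[%s] %s\n", ok ? "PASS" : "FAIL", s.name.c_str());
        failures += !ok;
    }
    return failures;  // nonzero exit code flags regressions in CI
}
```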

How can we achieve the time synchronization that we need for distributed architectures?
In a dynamically changing environment, you can only fuse or compare data (for diagnostic purposes) from diverse sources when you know that the data comes from the same instant in time.

As a result, with increasing numbers of sensors, radars, cameras, and LiDARs distributed throughout the vehicle, it is very important to understand how the various data streams relate to one another in time. For example, the central control unit needs accurately time-stamped data to gather all of the information required to make decisions with low latency. Accurate time synchronization is also essential for the machine-learning models required for L3–L5 autonomous vehicles.

The challenge is that data arrives from the various sources at different rates, yet time synchronization is critical to interpreting it. Take sensor fusion, for example. Low-level sensor fusion, which provides extensive detail on the car's environment, requires more bandwidth and more complex time synchronization than high-level sensor fusion.
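
One way to picture the pairing problem: fuse two sensor measurements only when their timestamps, stamped against a vehicle-wide synchronized clock, agree within a tolerance. The sketch below is a simplified illustration under those assumptions (the field names and the 5-ms window are arbitrary), not a production fusion pipeline.

```cpp
#include <cstdint>
#include <cstdio>

struct Measurement {
    int64_t timestamp_us;  // from the vehicle-wide synchronized clock
    double  range_m;       // e.g., range to the nearest object ahead
};

// Fuse only if both samples describe (nearly) the same instant; in a
// dynamic scene, a stale pairing would corrupt the fused estimate.
bool try_fuse(const Measurement& radar, const Measurement& lidar,
              int64_t tolerance_us, double* fused_range) {
    int64_t dt = radar.timestamp_us - lidar.timestamp_us;
    if (dt < 0) dt = -dt;
    if (dt > tolerance_us) return false;
    *fused_range = 0.5 * (radar.range_m + lidar.range_m);  // naive average
    return true;
}

int main() {
    Measurement radar{1'000'000, 24.8};
    Measurement lidar{1'003'500, 25.2};
    double fused = 0.0;
    if (try_fuse(radar, lidar, 5'000, &fused))  // 5-ms pairing window
        std::printf("fused range: %.2f m\n", fused);
}
```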

Time synchronization standards will also vary by network protocol, adding more layers of complexity for automotive engineers. Ethernet Time-Sensitive Networking (TSN) and Ethernet Audio Video Bridging (AVB) provide a networking approach to address time synchronization for automotive control systems. GNSS is another approach, particularly for autonomous navigation systems.
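
TSN and AVB derive their shared clock from the generalized Precision Time Protocol (IEEE 802.1AS), whose core arithmetic is a two-way timestamp exchange. The worked example below is a simplified IEEE 1588-style calculation with illustrative numbers; a real gPTP stack adds peer-delay measurement, rate correction, and filtering.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Two-way exchange (all values in ns, illustrative):
    // t1: master sends Sync (master clock)    t2: slave receives (slave clock)
    // t3: slave sends Delay_Req (slave clock) t4: master receives (master clock)
    const int64_t t1 = 1'000'000, t2 = 1'000'650, t3 = 1'000'900, t4 = 1'001'350;

    // Assuming a symmetric link, the four timestamps yield both the slave's
    // clock offset and the one-way path delay:
    const int64_t offset_ns = ((t2 - t1) - (t4 - t3)) / 2;  // +100: slave runs ahead
    const int64_t delay_ns  = ((t2 - t1) + (t4 - t3)) / 2;  // 550: path delay

    std::printf("offset = %lld ns, delay = %lld ns\n",
                static_cast<long long>(offset_ns),
                static_cast<long long>(delay_ns));
}
```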

How do we meet sensor and camera requirements?
Whereas L1–L2 vehicles still have the human element as backup, in L3–L5 vehicles the cameras, radars, LiDARs, and other sensors must function as the "eyes" of the vehicle. As such, they need to handle several factors, including diverse data coming from multiple sources and operation in multiple environmental conditions.

Sensors and cameras must be able to detect multiple types of features in the vehicle’s surroundings with high accuracy to enable a consistent, comfortable, and safe driving experience. This includes all of the abnormal things that occur on the road that a human driver would see and act upon, e.g., an animal unexpectedly running onto the road, other cars not following traffic rules, potholes, or pedestrians crossing the street from unexpected places.

The vision system must also operate in weather conditions ranging from bright sun to rain or snow to fog. Vision systems — based on a combination of high-accuracy vision-processing and sensor-fusion technologies and high-performance hardware optimized for computer vision — offer the computing and software horsepower needed to support the more complex requirements demanded by a growing number of sensors and cameras in the vehicle.

How can we reduce power consumption while still providing the compute power needed to allow these systems to function?
Production volume drives cost pressure, which in turn drives engineering innovation. The computing challenge in autonomous vehicle design is also a power discussion. With L3–L5, the goal is as much computing as possible at low power consumption while meeting the required cost points.

One way to address this challenge is through optimization. For instance, software is where a lot of power is consumed, and that has a lot to do with how microprocessors, GPUs, and other chips are architected. The increased functional safety requirements pose additional challenges: fail-operational requires triple redundancy (see the voter sketch below), while fail-silent requires redundant systems. The vehicle cannot offer unlimited power, and heat is a further concern, as thermal dissipation rises with power consumption.
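
As a minimal sketch of the triple-redundancy idea, a majority voter can mask one faulty channel and keep the function operational. The voting policy, tolerance, and steering signal here are assumptions chosen for illustration.

```cpp
#include <cmath>
#include <cstdio>

// Vote by pairwise agreement: any two channels that agree within `tol`
// outvote the third, masking a single faulty channel.
double tmr_vote(double a, double b, double c, double tol, bool* valid) {
    *valid = true;
    if (std::fabs(a - b) <= tol) return 0.5 * (a + b);
    if (std::fabs(a - c) <= tol) return 0.5 * (a + c);
    if (std::fabs(b - c) <= tol) return 0.5 * (b + c);
    *valid = false;  // no majority: escalate to a degraded or safe-stop mode
    return 0.0;
}

int main() {
    bool valid = false;
    // Channel b has drifted; the voter masks it and the function stays live.
    const double steer_deg = tmr_vote(2.01, 9.75, 1.98, 0.1, &valid);
    std::printf("steer = %.2f deg, valid = %d\n", steer_deg, valid);
}
```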

Specialized hardware accelerators enable processors to meet specific application performance requirements at very low power. Understanding future system needs and designing hardware accelerators specifically for those requirements will also enable cost-effective and power-conscious autonomous systems.

How can existing ADAS investments be leveraged to move to mass production more efficiently and effectively?
With the shift to connected cars over the past several years, the automotive industry has already developed an extensive pool of expertise around developing and deploying active safety features into production.

Features such as advanced emergency braking, adaptive cruise control, lane assist, cross-traffic alert, surround view, traffic jam assist, and automated parking are becoming more commonplace in vehicles from entry level to the high end. Autonomous driving systems face similar challenges as they move from testing and prototypes into mainstream vehicles. The automotive industry can leverage its previous advanced driver-assistance system (ADAS) investments and lessons learned to overcome the current challenges and allow autonomous driving to scale toward full autonomy.
