LiDAR and radar advance for ADAS

New advances in LiDAR and radar technologies meet safety-critical requirements for ADAS and autonomous-driving applications.

Advanced driver-assistance systems (ADAS) and autonomous-driving technologies rely heavily on sensors, including LiDAR, radar and cameras. Recently, there have been significant advancements in these technology areas as more vehicles become equipped with advanced sensor systems.

In this article, we will discuss recent trends in automotive LiDAR, radar and cameras and their impact on the development of ADAS and autonomous driving.

LiDAR sensors and solutions

LiDAR has become central to the development of ADAS and autonomous driving because it creates a 3D map of the surrounding environment, capturing the spatial position, and in some implementations the velocity, of surrounding objects.

A Counterpoint Technology Market Research report, “Global Autonomous Passenger Vehicle Market 2019–2030,” predicts that 10% of new cars produced worldwide by 2025 will be capable of Level 3 driving. Level 4 vehicles will initially appear in developed markets such as the U.S. and Europe, though these markets will have a higher proportion of Level 3 vehicles (subject to regulatory approval). Counterpoint suggests that LiDAR has substantial development potential at Level 3 and above and will reach the mass market by 2030.

According to Counterpoint, the LiDAR market is expected to grow at a compound annual growth rate of 65.9%, reaching $15 billion and over 100 million units shipped by 2030.

One of the key trends in LiDAR development is the push toward 4D LiDAR. In addition to measuring the distance to objects using reflected laser light for 3D positioning, this technology captures each point’s instantaneous velocity as a fourth dimension. This allows a more detailed understanding of the environment and of the movement of objects within it, which is crucial for autonomous machines such as vehicles and robots to make precise, accurate decisions.
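
To make the fourth dimension concrete, the sketch below (plain Python with hypothetical field names, not any vendor’s data format) shows how a point carrying position plus per-point radial velocity can separate moving objects from the static background within a single frame, with no frame-to-frame differencing:

```python
from dataclasses import dataclass

@dataclass
class Point4D:
    """One 4D LiDAR return: 3D position plus instantaneous radial velocity.

    Field names are illustrative; real sensors define their own formats.
    """
    x: float  # meters, forward
    y: float  # meters, left
    z: float  # meters, up
    v: float  # m/s, radial velocity relative to the sensor

def split_dynamic(points, v_threshold=0.5):
    """Separate moving returns from the static background.

    Assumes a stationary sensor; a moving ego vehicle would first
    compensate for its own motion before thresholding.
    """
    static = [p for p in points if abs(p.v) <= v_threshold]
    dynamic = [p for p in points if abs(p.v) > v_threshold]
    return static, dynamic

# Example: a wall (v close to 0) and a car closing at roughly 50 km/h
cloud = [Point4D(10.0, 0.0, 1.0, 0.0), Point4D(25.0, 1.5, 0.8, -13.9)]
background, movers = split_dynamic(cloud)
```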

Aeva Inc., a company specializing in 4D LiDAR sensors, has recently announced its next-generation 4D LiDAR sensor (Figure 1). Based on the frequency-modulated continuous-wave (FMCW) technique, the new Aeries II 4D LiDAR sensor directly measures each point’s instantaneous velocity along with its precise 3D position, resulting in groundbreaking sensing and perception capabilities.

FMCW technology is intrinsically immune to interference from direct sunlight, other LiDAR sensors and retroreflector ghosting. By using the additional dimension of velocity data, these sensors offer distinct benefits over traditional time-of-flight 3D LiDAR sensors (the chirp arithmetic behind the per-point velocity is sketched after the list), including:

  • Extended range, allowing dynamic objects such as cars, bicycles and pedestrians to be detected, classified and tracked at greater distances
  • Ultra resolution, a real-time camera-level image with up to 20× the resolution of traditional time-of-flight LiDAR sensors
  • Road-hazard detection, spotting small objects on the road more reliably and at up to twice the distance of traditional LiDAR sensors
  • 4D localization, which estimates the vehicle’s motion in real time in six degrees of freedom, enabling precise positioning and navigation without additional sensors such as IMUs or GPS
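
To show how a single FMCW sweep yields both range and velocity for every point, here is a minimal sketch of the standard triangular-chirp arithmetic (generic textbook math in Python, not Aeva’s implementation; the 1,550-nm wavelength is an assumption used only for illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_velocity(f_up, f_down, chirp_slope, wavelength=1.55e-6):
    """Recover range and radial velocity from a triangular FMCW chirp.

    f_up, f_down : beat frequencies (Hz) measured on the up- and down-chirp
    chirp_slope  : optical frequency sweep rate (Hz/s)
    wavelength   : carrier wavelength (m); 1,550 nm is a common FMCW
                   LiDAR choice, assumed here for illustration

    The range-induced beat is the average of the two measurements, and
    the Doppler shift is half their difference, so one triangular sweep
    yields both distance and velocity for every point.
    """
    f_range = (f_up + f_down) / 2.0
    f_doppler = (f_down - f_up) / 2.0
    distance = C * f_range / (2.0 * chirp_slope)
    velocity = f_doppler * wavelength / 2.0  # positive = target approaching
    return distance, velocity
```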

Figure 1: Aeva’s Aeries II 4D LiDAR sensor (Source: Aeva Inc.)

Aeries II is the first sensor on the market to incorporate LiDAR-on-chip technology, integrating all essential sensor components, including transmitters, receivers and optics, into a compact silicon photonics module. Thanks to this high level of integration and the elimination of fiber optics, the manufacturing process is highly automated.

Another major trend in LiDAR technology is to exploit the flexibility of software by adding artificial-intelligence algorithms to the hardware sensor, enhancing the system’s performance and widening its range of applications.

AEye Inc. has developed the 4Sight Intelligent Sensing Platform, a software-defined LiDAR solution that leverages AI to enable dynamic transportation and mobility applications that save lives. Unlike traditional passive LiDAR systems, which cannot account for changing conditions or balance competing priorities, AEye’s adaptive LiDAR offers high-performance, software-definable solutions that can be tailored to satisfy the various performance and functional demands of any autonomous application. The 4Sight system is flexible and adaptable to a variety of markets, applications, use cases, settings and weather conditions.

Built on AEye’s 4Sight Intelligent Sensing Platform, Continental AG’s HRL131 is on track to be the industry’s first high-resolution, solid-state, long-range LiDAR sensor to enter series production in the automotive market. The HRL131 is a crucial part of Continental’s full-stack, automotive-grade system for Level 2 to Level 4 automated- and autonomous-driving applications, completing the company’s sensor suite of radar, cameras and ultrasonic technology. An example of the HRL131’s scan-pattern configuration is shown in Figure 2.

Figure 2: HRL scan pattern configurations (Source: AEye Inc.)

Radar imaging

Using radio waves to detect and locate objects in the environment, radar is often employed with other sensors, including cameras and LiDAR, to provide a holistic perception system. One of the current trends in radar development is improving resolution and range. This is important, as it allows for more detailed mapping of the environment, providing more precise information for ADAS and autonomous-driving systems.
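
The link between bandwidth and resolution is worth making explicit. For a chirp of bandwidth B, the achievable range resolution is ΔR = c/(2B), which the quick worked example below illustrates (generic radar math, not tied to any product mentioned here):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Smallest range separation at which two targets stay distinguishable."""
    return C / (2.0 * bandwidth_hz)

# A 1-GHz sweep resolves ~15 cm; using most of the 76- to 81-GHz
# automotive band (about 4 GHz of usable bandwidth) approaches ~4 cm.
print(f"{range_resolution(1e9) * 100:.0f} cm")  # 15 cm
print(f"{range_resolution(4e9) * 100:.0f} cm")  # 4 cm
```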

Israeli company Arbe recently unveiled Lynx, a 360° radar solution for self-driving vehicles. Until now, 360° tracking has tended to rely on imaging. The technology developed by Arbe uses a suite of radars to provide what the company claims is the first integrated AI-based analysis of a vehicle’s entire surroundings.

Radar is essential in autonomous driving, as it can identify, classify and monitor objects. By acquiring and processing data in real time, the system can build a complete map of the free space around the vehicle and analyze potential hazards. This functionality relies on overlapping radar coverage, which allows objects of interest to be tracked smoothly from one unit to the next while their locations are validated by two separate perception algorithms. As a result, several scenarios are handled better, such as driving in congested urban traffic, merging safely onto highways and watching for empty lanes for potential emergency maneuvers.
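
A few lines of illustrative code (hypothetical structures, not Arbe’s software) can show the idea: while an object sits in the overlap of two units, their detections are cross-validated and fused, so the track survives the handoff from one field of view to the next:

```python
import math

def same_object(det_a, det_b, gate_m=1.0):
    """Cross-validate detections from two overlapping radar units.

    det_a, det_b: (x, y) positions reported in a shared vehicle frame.
    Agreement within the gating distance confirms the object; a large
    disagreement would flag one unit's detection for review.
    """
    return math.dist(det_a, det_b) <= gate_m

# An object leaving the front radar's field of view is handed over to
# the corner radar: while both see it, their estimates are combined
# (here, a simple average) so tracking never breaks at the seam.
front, corner = (12.0, 3.0), (12.3, 3.1)
if same_object(front, corner):
    fused = tuple((a + b) / 2.0 for a, b in zip(front, corner))
```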

Like LiDAR, radar is trending toward solutions that are highly integrated on a single chip. An example is NXP Semiconductors’ new SAF85xx, the industry’s first 28-nm RFCMOS radar one-chip IC family, designed for next-generation ADAS and autonomous-driving systems.

This one-chip family combines high-performance radar sensing and processing technologies into a single device, offering Tier 1s and OEMs new flexibility in addressing short-, medium- and long-range radar applications to serve increasingly challenging current and future New Car Assessment Program (NCAP) safety requirements (see Figure 3).

The SAF85xx integrates a radar transceiver designed to operate from 76 GHz to 81 GHz, covering the full automotive radar frequency band, with a radar microprocessor based on Arm Cortex-A53 and Cortex-M7 cores, plus SRAM. Offering twice the RF performance and accelerating radar signal processing by up to 40% compared with the previous chip generation, the one-chip family enables 4D sensing for corner and front radar. This serves critical ADAS safety applications such as automated emergency braking, adaptive cruise control, blind-spot monitoring, cross-traffic alert and automated parking.

Figure 3: The SAF85xx radar sensor supports short-, medium- and long-range radar applications. (Source: NXP Semiconductors)

Cameras

Cameras provide visual information about the surrounding environment and are used to detect lane markings, traffic lights and other vehicles, making them an essential part of ADAS and autonomous driving.

Besides the move toward higher-resolution sensors, one of the key trends in camera development is combining different types of sensors, also known as sensor fusion. A critical development area has been integrating LiDAR, radar and cameras in scenarios characterized by high levels of uncertainty or noise.

Sensor fusion combines data from multiple sensors to build a more comprehensive understanding of the environment. It is critical for ADAS and autonomous driving because each sensing modality compensates for the others’ weaknesses, yielding a more accurate picture of the surroundings and better decision-making by the vehicle.
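
As a minimal illustration (a sketch of the general principle, not any vendor’s pipeline), the snippet below fuses two noisy distance estimates, say one from a camera and one from a radar, by weighting each with the inverse of its variance, which is the one-dimensional form of the Kalman measurement update:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance-weighted fusion of two independent measurements.

    The more certain sensor dominates the result, and the fused variance
    is smaller than either input: two noisy views beat one.
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Camera puts a pedestrian 14.0 m ahead (variance 1.0 m^2); radar says
# 14.6 m (variance 0.25 m^2), so the radar is trusted four times more.
position, uncertainty = fuse(14.0, 1.0, 14.6, 0.25)  # -> 14.48 m, 0.2 m^2
```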
