The implementation of advanced driver-assistance systems (ADAS) and autonomous-driving systems requires advanced sensors capable of providing detailed, real-time information about a vehicle’s surroundings while it is moving. Among the most suitable solutions to meet these requirements are LiDAR and radar sensors, whose use in the automotive sector is growing at a steady pace.
LiDAR market growth
Due to its ability to scan the environment, detect objects, and track their movement, LiDAR technology has great potential in automotive applications. The most relevant applications for LiDAR sensors are ADAS functions such as lane-departure warning, collision warning, and lane-keeping assistance. Because these functionalities are tied to vehicle safety, they are expected to boost LiDAR growth as many countries impose regulations aimed at reducing road accidents and traffic congestion.
Even though the latest-generation vehicles are much smarter than before, we are still far from achieving full automation. According to the Society of Automotive Engineers (SAE), there are six levels of autonomous driving, as defined below:
- Level 0: No automation
- Level 1: Driver assistance
- Level 2: Partial automation
- Level 3: Conditional automation
- Level 4: High automation
- Level 5: Full automation
As of today, LiDAR sensors have already been installed in some L2 and even L3 vehicles. Starting from Level 3 autonomous driving, most carmakers agree on the need to use LiDAR sensors. Figure 1 shows a Mercedes-Benz S-Class vehicle capable of Level 3 autonomous driving with the LiDAR sensor positioned behind the radiator grille.
Level 4 autonomous driving (currently subject to regulatory approval) is expected to enter North American and European markets in the next few years for some applications, after Level 3 vehicles reach mass production. This suggests that LiDAR has significant room for growth.
The expansion of the global automotive LiDAR market is partly limited by the high cost of these systems, which combine cutting-edge hardware (sensors and scanners) with high-performance processors and post-processing software. Although a significant number of manufacturers are expected to enter the market over the next five years, LiDAR systems remain expensive today, which limits their current use to high-end vehicles.
In addition, adverse weather conditions, such as dense fog, heavy snow, rain, or direct sunlight, affect the overall performance of automotive LiDAR devices.
Radar market growth
The increasing introduction of on-board safety features is driving the market demand for automotive short- and medium-range radar solutions. Lower prices and the requirement to install more than two radars per vehicle to improve safety are projected to significantly speed up the segment’s growth in the upcoming years.
Additionally, stringent regulations and programs to improve car safety are being implemented by governments all over the world. Several research reports, including those from the National Highway Traffic Safety Administration (NHTSA), find that most traffic fatalities are the result of driver error.
Key factors predicted to expand the automotive radar market are the rise in safety awareness among vehicle users and new regulations established by international regulatory organizations to safeguard both passengers and vehicles by reducing accidents.
An example of these initiatives is the European New Car Assessment Programme (Euro NCAP), which, by assessing the safety of new cars, triggers relevant modifications in vehicle design to improve safety. The main goals are to ensure occupant safety, decrease costs, and improve the overall safety of transportation, resulting in increased spending on radar technology and its applications.
LiDAR and radar trends
The usage of LiDAR and radar sensors in vehicles is continually growing, with the number of units installed set to increase significantly in the coming years. In addition to passive and active safety, which remains the fundamental requirement of any automotive solution, other key factors are bound to drive the increasing adoption of LiDAR and radar solutions in standard vehicles.
The key factors driving the growth and technology evolution of LiDAR and radar systems in the automotive sector are essentially the following:
- Autonomous driving
- Data fusion
- 4D technology
Autonomous driving
The desire for autonomous vehicles is rising, and with more technological developments making this a reality, more automakers are investing in autonomous-vehicle programs. To offer a fully driverless experience, these cars will be entirely operated by sensors and electronics.
For autonomous vehicles, LiDAR sensors are necessary, as they are key parts of crucial systems, including blind-spot monitoring, lane-keeping assistance, lane-departure warning, and collision warning. LiDAR can gather enough data about the area around the car to allow the on-board computer to take control from the driver. LiDAR sensors generate detailed 3D maps, which enable 360° vision and provide accurate information for autonomous navigation and object detection.
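To make the idea of a LiDAR 3D map more concrete, the following minimal Python sketch (illustrative only, with hypothetical function names and values) shows how a single time-of-flight return, together with the beam’s azimuth and elevation, can be converted into a 3D point around the sensor.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_return_to_point(round_trip_time_s, azimuth_deg, elevation_deg):
    """Convert one time-of-flight return and its beam angles into a 3D point.

    The range is half the round-trip distance; azimuth and elevation place
    the point on a sphere of that radius around the sensor.
    """
    r = C * round_trip_time_s / 2.0       # one-way range in meters
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)   # forward
    y = r * math.cos(el) * math.sin(az)   # left/right
    z = r * math.sin(el)                  # up/down
    return (x, y, z)

# Example: a return after ~333 ns corresponds to a target roughly 50 m ahead.
print(lidar_return_to_point(333e-9, azimuth_deg=10.0, elevation_deg=2.0))
```

Repeating this conversion for millions of returns per second, across the full sweep of beam angles, is what produces the dense point cloud the on-board computer uses for navigation and object detection.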
Due to their smaller form factor, lower cost, and higher reliability, solid-state LiDARs are expected to be preferred by carmakers over traditional mechanical scanning LiDARs.
The trend represented by self-driving cars is further strengthened by the rapid development of projects like robotaxis, which provide either fully autonomous or semi-autonomous driving and rely heavily on LiDAR sensors for their operation. An example is the Hyundai IONIQ 5 robotaxi project (see Figure 2), which will debut in Las Vegas in 2023.
The IONIQ 5 is an SAE Level 4 autonomous vehicle that can operate safely without a driver. Hyundai Motor Company partnered with Motional, a pioneer and leader in driverless technology, to equip the IONIQ 5 with dedicated hardware and software that ensure safe and secure driverless operation.
Hyundai has equipped the 100% electric IONIQ 5 crossover with software and hardware that include a combination of cameras, radar, and LiDAR, ensuring autonomous, safe, and secure operation in various driving conditions. The integration of over 30 sensors provides reliable 360° viewing, high-resolution imaging, and long-range object detection.
Data fusion
Data fusion is a general concept applicable to different contexts in which a system has to make decisions based on information coming from a variety of sensors. By linking together and correlating the information acquired from multiple physical sensors, it is possible to obtain aggregate data that helps reduce the degree of uncertainty relating to a real-world scenario.
A practical example of data fusion in the automotive field is the integration of visual sensors (cameras), radar, and LiDAR to obtain more detailed information about a vehicle’s surroundings when in motion.
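As a rough illustration of the principle, and not of any specific automotive stack, the sketch below fuses hypothetical range estimates from a camera, a radar, and a LiDAR using inverse-variance weighting: the less uncertain a sensor is, the more it contributes, and the fused estimate ends up less uncertain than any single sensor.

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of independent range estimates.

    `measurements` is a list of (value, variance) pairs, one per sensor.
    Sensors with lower variance get more weight, and the fused variance is
    always smaller than that of the best single sensor.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Hypothetical range-to-obstacle estimates as (meters, variance):
camera = (41.0, 4.0)   # camera: coarse depth from a 2D image
radar  = (39.5, 0.5)   # radar: accurate range, coarse shape
lidar  = (39.8, 0.2)   # LiDAR: accurate range and shape

print(fuse_estimates([camera, radar, lidar]))  # ~39.8 m with reduced variance
```

Production systems use far more elaborate techniques (Kalman filters, object-level track fusion, learned models), but the underlying idea is the same: combining sensors reduces uncertainty.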
LiDAR, for instance, provides several benefits over systems relying on cameras. Compared with cameras, which provide only a flat, 2D representation of the surroundings, LiDAR delivers a 3D perspective of the world that is substantially more accurate and realistic.
In addition, because it produces its own light, LiDAR is unaffected by low light levels and continues to operate normally. Unlike camera-based systems, which are not always reliable at night because they need sufficient light to perform at their best, LiDAR can detect objects at night just as well as it can during the day.
Radar sensors, on the other hand, are effective at identifying objects very close to the vehicle and can determine both how far away those objects are and how fast they are moving. That is why carmakers use radar for ADAS functions such as parking sensors, blind-spot monitoring, and adaptive cruise control.
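The relationships behind those range and speed measurements can be sketched in a few lines. The snippet below uses hypothetical values and assumes a 77-GHz carrier, as commonly used in automotive radar, to show the classic Doppler and FMCW beat-frequency formulas.

```python
C = 299_792_458.0  # speed of light in m/s

def doppler_velocity(doppler_shift_hz, carrier_hz=77e9):
    """Relative (radial) speed from the Doppler shift of the reflected wave.

    v = f_d * lambda / 2, with lambda = c / f_carrier.
    A 77-GHz carrier is assumed, typical of automotive radar.
    """
    wavelength = C / carrier_hz
    return doppler_shift_hz * wavelength / 2.0

def fmcw_range(beat_freq_hz, chirp_slope_hz_per_s):
    """Range from the beat frequency of an FMCW chirp: R = c * f_b / (2 * S)."""
    return C * beat_freq_hz / (2.0 * chirp_slope_hz_per_s)

# Hypothetical measurements: a 10-kHz Doppler shift and a 1-MHz beat frequency
# with a 30 MHz/us chirp slope.
print(doppler_velocity(10e3))        # ~19.5 m/s (~70 km/h) of relative speed
print(fmcw_range(1e6, 30e6 / 1e-6))  # ~5 m to the reflecting object
```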
Unlike LiDAR, radar offers excellent performance in adverse weather conditions, including rain, fog, and snow. Its main limitation is accurately determining an object’s position, size, and shape, which is essential for self-driving cars that must reliably detect objects such as bikes, animals, and pedestrians.
A combination of vision sensors (cameras), radar, and LiDAR offers a level of reliability higher than any of these sensors alone. When a software platform built on artificial-intelligence algorithms is added to the hardware sensors, even higher performance can be achieved in the most critical operating conditions, or in the event of degraded performance from one or more sensors.
4D technology
Current LiDAR sensors can provide a 3D representation of a vehicle’s surroundings while it is moving, thanks to repeated scans acquired at different angles. Radar sensors, on the other hand, provide measurements that, when properly processed, determine the range and relative speed of objects or other vehicles.
A recent trend in both LiDAR and automotive radar sensors is the addition of the fourth dimension, which provides extremely detailed information on both 3D position and velocity of objects located in the vehicle’s surrounding area.
A state-of-the-art 4D radar system can detect an object’s range, azimuth, elevation, and relative speed, giving more precise information than first-generation radar systems, which collected only speed, range, and angle-of-arrival data. In terms of performance, 4D radar sensors are comparable to LiDAR systems, which are far more expensive and less effective in poor-visibility conditions such as rain and fog.
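The value of the fourth dimension can be illustrated with a small sketch (the data structure, threshold, and numbers below are hypothetical): because each detection carries a radial velocity in addition to range, azimuth, and elevation, a moving target can be separated from the static background once the vehicle’s own speed is taken into account.

```python
from dataclasses import dataclass
import math

@dataclass
class Radar4DDetection:
    """One 4D radar detection: 3D position (spherical) plus radial velocity."""
    range_m: float
    azimuth_deg: float
    elevation_deg: float
    radial_velocity_mps: float  # the "fourth dimension": speed along the beam

def is_moving(det: Radar4DDetection, ego_speed_mps: float, threshold=0.5):
    """Flag a detection as a moving object.

    A stationary object seen from a moving vehicle shows an apparent radial
    velocity of roughly -ego_speed * cos(azimuth) * cos(elevation); anything
    deviating from this by more than the threshold is treated as moving.
    """
    expected = (-ego_speed_mps
                * math.cos(math.radians(det.azimuth_deg))
                * math.cos(math.radians(det.elevation_deg)))
    return abs(det.radial_velocity_mps - expected) > threshold

# A guardrail ahead (stationary) versus a car pulling away at ~5 m/s:
guardrail = Radar4DDetection(60.0, 5.0, 0.0, -13.9)
lead_car  = Radar4DDetection(45.0, 0.0, 0.0, 5.0)
print(is_moving(guardrail, ego_speed_mps=14.0))  # False
print(is_moving(lead_car,  ego_speed_mps=14.0))  # True
```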
4D radar’s increased resolution and sensitivity unlock the potential of commercial radars already proven in the market, providing all-weather, sub-degree horizontal and elevation spatial resolution for long-range applications across a wide field of view. Unlike conventional cameras or more sophisticated LiDARs, 4D radar technology can detect whether and how quickly a vehicle is moving in all kinds of weather and environmental conditions.
Several chipmakers, including Steradian and SiLC, have introduced 4D imaging systems.
Steradian Semiconductors, a Renesas company, has developed a real-time 4D imaging radar system based on a proprietary high-performance CMOS radar transceiver IC, achieving high spatial resolution by cascading four transceivers. The resulting MIMO configuration produces 256 virtual channels, which are used to sample the space ahead both horizontally and vertically, enabling long-range applications to detect targets beyond 250 meters. Thanks to specialized object-tracking and data-association algorithms, the radar system can monitor independently moving objects even in cluttered environments such as heavy traffic.
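The 256-channel figure follows directly from how MIMO virtual arrays are counted: each transmit/receive pair acts as one virtual channel. The per-IC channel split below is a hypothetical example; the source states only that four cascaded transceivers yield 256 virtual channels.

```python
def virtual_channels(num_tx: int, num_rx: int) -> int:
    """In a MIMO radar, each Tx/Rx antenna pair forms one virtual channel."""
    return num_tx * num_rx

# Assuming (hypothetically) that each transceiver IC contributes 4 Tx and 4 Rx
# channels, cascading four ICs gives 16 Tx and 16 Rx, i.e. 256 virtual channels.
chips, tx_per_chip, rx_per_chip = 4, 4, 4
print(virtual_channels(chips * tx_per_chip, chips * rx_per_chip))  # 256
```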
SiLC Technologies, a California-based startup, has developed a LiDAR technology that exploits a coherent sensor to enable 4D vision in automotive, as well as in robotics and industrial applications. Unlike today’s 3D LiDARs, which rely on high-power lasers using a time-of-flight architecture at 905-nm wavelength, SiLC’s solution is based on frequency-modulated continuous-wave (FMCW) technology operating at 1550-nm wavelength, thus addressing eye-safety regulatory concerns and enabling volume deployment with minimal interference when using multiple LiDARs.
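A minimal sketch of the FMCW principle follows, with hypothetical numbers rather than SiLC’s actual parameters: on a triangular chirp, the Doppler shift adds to the beat frequency on one slope and subtracts on the other, so range and radial velocity can be recovered together, which is what gives coherent LiDAR its fourth dimension.

```python
C = 299_792_458.0     # speed of light in m/s
WAVELENGTH = 1550e-9  # FMCW LiDAR operating wavelength in meters

def fmcw_range_and_velocity(beat_up_hz, beat_down_hz, chirp_slope_hz_per_s):
    """Recover range and radial velocity from a triangular FMCW chirp.

    The Doppler shift adds to the beat frequency on one chirp slope and
    subtracts on the other, so averaging separates range from velocity:
        f_range = (f_up + f_down) / 2    ->  R = c * f_range / (2 * S)
        f_doppler = (f_down - f_up) / 2  ->  v = f_doppler * lambda / 2
    """
    f_range = (beat_up_hz + beat_down_hz) / 2.0
    f_doppler = (beat_down_hz - beat_up_hz) / 2.0
    range_m = C * f_range / (2.0 * chirp_slope_hz_per_s)
    velocity_mps = f_doppler * WAVELENGTH / 2.0
    return range_m, velocity_mps

# Hypothetical beat frequencies for a target ~150 m away approaching at ~13 m/s
# (positive velocity = closing, under this sign convention), with a chirp slope
# of 1 GHz per 10 us:
print(fmcw_range_and_velocity(83.2e6, 116.8e6, 1e9 / 10e-6))
```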
SiLC has also developed the Eyeonic Vision Chip (see Figure 3), the first commercially available single-chip FMCW LiDAR sensor, integrating all photonics functions needed to enable a coherent vision sensor. These include an ultra-low–linewidth laser, a semiconductor optical amplifier, germanium detectors, and meters of optical circuits.
The Eyeonic Vision Sensor recently demonstrated the industry’s longest detection range beyond 1,000 meters, providing not only depth information but also velocity and polarization intensity data.
ADAS and autonomous driving require sensors with ever-higher performance, accuracy, and efficiency. In most cases, automotive cameras and radar are sufficient to cover current requirements; however, the highest levels of autonomous driving will require LiDAR sensors. The integration of these three types of sensors paves the way for scenarios in which data fusion and sensor redundancy can make a difference in automotive safety. The most recent and promising advances in radar and LiDAR sensors will provide detailed information on both the 3D position and the speed of objects, enabling next-generation vehicles to offer a safer and more comfortable ride.