AutoSens Europe 2024: sensors, demos and partnerships

The AutoSens conference showcased a range of sensor demonstrations and partnerships that will help shape the next generation of vehicles.

AutoSens Europe 2024, held in combination with InCabin Europe 2024, showcased the latest innovations in automotive sensor and camera systems for vehicles. The latest advanced-imaging solutions are driven primarily by the automotive industry’s move toward more autonomous capabilities, which require improvements in sensing and processing power, as well as by the need for greater safety.

These new sensing and imaging solutions enhance a range of applications from advanced-driver assistance systems (ADAS) and autonomous driving (AD) to automatic emergency braking (AEB) and in-cabin monitoring.

Automotive suppliers also announced several demos and partnerships that address issues around simplifying the design and improving the capabilities of in-cabin monitoring, AEB and ADAS systems. These solutions help automotive OEMs differentiate their brands, while adding comfort and safety features to protect drivers, passengers and pedestrians.

Here are several highlights from AutoSens and InCabin Europe 2024, including new sensor innovations, demos and partnerships that are driving the next generation of safety and connectivity in vehicles.

Automotive sensors

Omnivision announced two automotive sensors for cameras. Both incorporate the company’s TheiaCel high-dynamic-range (HDR) technology, which mitigates LED flicker in virtually any lighting condition. Omnivision said the TheiaCel solution achieves a wider dynamic range than earlier automotive HDR architectures.

Omnivision unveiled the new 3-megapixel (MP) OX03H10 CMOS image sensor at the show. It is the company’s first 3.0-micron (µm)-pixel automotive viewing sensor with TheiaCel technology. The new sensor is designed to improve driving safety by delivering better imaging clarity in all lighting conditions for automotive surround- and rear-view cameras and other vision systems.

The OX03H10 sensor offers 140-dB HDR and LED flicker mitigation thanks to TheiaCel technology, which combines Omnivision’s lateral overflow integration capacitor (LOFIC) technology with its proprietary single-exposure DCG and split-pixel HDR techniques, delivering high image quality in any lighting condition.

At 1920 × 1536 resolution, the OX03H10 captures 60 frames per second (fps) with low power consumption. The 3.0-µm pixel is based on PureCelPlus-S stacking technology, delivering the smallest pixel and highest resolution in a 1/2.44-inch optical format, Omnivision said.
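
As a quick sanity check of those headline figures (a rough illustration, not vendor data), the 1920 × 1536 array works out to roughly 3 MP, and 140 dB of dynamic range corresponds to about a 10,000,000:1 ratio between the brightest and darkest scene content captured in a single frame:

    # Back-of-the-envelope check of the published OX03H10 figures (illustrative only)
    h_pixels, v_pixels = 1920, 1536
    megapixels = h_pixels * v_pixels / 1e6    # ~2.95 MP, marketed as 3 MP
    hdr_db = 140
    contrast_ratio = 10 ** (hdr_db / 20)      # 140 dB ~= 10,000,000:1
    print(f"{megapixels:.2f} MP, ~{contrast_ratio:,.0f}:1 dynamic range")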

The image sensor meets ASIL-C functional safety requirements. Other features include cybersecurity support and a MIPI output interface.

The OX03H10 is available for sampling now and will be in mass production in the first half of 2025. It is housed in a small a-CSP package and is pin-to-pin compatible with the OX03F10 automotive image sensor for seamless upgrades.

Omnivision’s OX03H10 image sensor for automotive viewing cameras (Source: Omnivision)

Omnivision also introduced the OX12A10, a 12-MP image sensor with TheiaCel technology for LED-flicker-free automotive cameras in applications such as ADAS and AD. It can also be used in high-performance front machine-vision cameras.

The OX12A10 is the highest-resolution sensor in the company’s 2.1-µm TheiaCel product family, which also includes the 8-MP OX08D10 and 5-MP OX05D10 devices. The 5-MP and 8-MP sensors are designed for mainstream passenger vehicles, while the OX12A10 targets next-generation ADAS, AD and machine-vision requirements.

The OX12A10 has a 1/1.43-inch optical format and is the first image sensor to use the company’s a-CSP+ package technology. At 12,450 µm × 8,100 µm, the package offers a more compact design than traditional ball grid array (BGA) packages, Omnivision said.

The OX12A10 is available for sampling now and will be in mass production in the third quarter of 2025.

Omnivision’s OX12A10 image sensor for ADAS and AD applications (Source: Omnivision)

Demonstrations

Omnivision also partnered with Philips on what the companies claim is the first demo of a connected in-cabin vital signs monitoring solution, which tracks metrics such as pulse and breathing rate. The companies said the data may enable customization of comfort settings while driving, timed delivery of vehicle notifications and adaptive route and break suggestions.

The well-being monitoring prototype combines Philips’ vital signs camera for automotive software with Omnivision’s OX05B1S CMOS image sensor, a 5-MP RGB-IR backside illuminated (BSI) global shutter sensor for in-cabin monitoring systems.

The OX05B1S image sensor features Nyxel technology, which delivers what Omnivision calls industry-leading quantum efficiency at the 940-nm near-infrared (NIR) wavelength. “This enables the OX05B1S to detect and recognize objects that other image sensors would miss under extremely low lighting conditions, providing higher performance in‑cabin camera capabilities,” Omnivision said.

In addition, the solution leverages Omnivision’s AI-enabled OAX4600 image signal processor, which processes the data from the image sensor for the med-tech system.
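
Neither company has published implementation details, but camera-based pulse estimation generally relies on remote photoplethysmography (rPPG): tiny periodic brightness changes in skin pixels caused by blood flow are extracted from the video stream and filtered to plausible heart-rate frequencies. The following Python sketch illustrates only that generic principle, not the Philips/Omnivision pipeline; the function name and inputs are hypothetical:

    import numpy as np

    def estimate_pulse_bpm(skin_means, fps):
        """Rough rPPG-style pulse estimate from the mean skin-region
        brightness of each video frame (generic illustration only)."""
        signal = np.asarray(skin_means) - np.mean(skin_means)  # remove DC offset
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)      # Hz per FFT bin
        band = (freqs > 0.7) & (freqs < 3.0)                   # ~42-180 beats/min
        peak_hz = freqs[band][np.argmax(spectrum[band])]
        return peak_hz * 60.0                                  # beats per minute

In practice, robustness to motion, lighting changes and occlusions is the hard part, which is where the sensor’s NIR sensitivity and the dedicated Philips software come in.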

Omnivision and Philips joint in-cabin vital signs monitoring demo (Source: Omnivision)

Lynred, a provider of infrared sensors for the aerospace, defense and commercial markets, demonstrated a prototype 8.5-µm pixel-pitch sensor, which it claims is the smallest infrared sensor for future AEB and ADAS systems. The live demo showed the prototype QVGA 8.5-µm pixel-pitch thermal imaging sensor embedded in a demo camera.

AEB systems will be mandatory in all new light vehicles in the U.S. by 2029, according to a recent ruling by the National Highway Traffic Safety Administration (NHTSA). The agency believes driver assistance technologies have the potential to reduce traffic crashes and save thousands of lives each year. However, the European Transport Safety Council (ETSC) has previously reported that AEB systems need to work better in wet, foggy and low-light conditions.

Lynred said thermal imaging sensors can detect and identify objects in total darkness.

The prototype has half the surface area of current 12-µm thermal imaging sensors for automotive applications, according to Lynred, and will enable system developers to shrink the size of thermal cameras for AEB systems by 50% while maintaining the same performance standards as larger longwave infrared (LWIR) models.
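
The 50% figure follows directly from the pixel pitch: for the same pixel count, sensor area scales with the square of the pitch, and (8.5/12)² ≈ 0.50. A quick check, assuming the same QVGA array for both pitches:

    # Area ratio of an 8.5-um-pitch QVGA array vs. a 12-um-pitch one (illustrative)
    cols, rows = 320, 240                       # QVGA pixel count
    area_12um = (cols * 12.0) * (rows * 12.0)   # active area in um^2
    area_8p5um = (cols * 8.5) * (rows * 8.5)
    print(area_8p5um / area_12um)               # ~0.50, i.e. half the surface area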

The company is preparing a roadmap of solutions to help customers meet NHTSA compliance, including readying its automotive-qualified 12-µm product for high-volume production to enable pedestrian AEB systems.

Lynred’s 12-µm and 8.5-µm pixel-pitch thermal imaging sensors target AEB systems. (Source: Lynred)

Addressing ADAS, AD and parking applications, automotive software company LeddarTech Holdings Inc. showcased its AI-based low-level sensor fusion and perception software technology, LeddarVision. The demonstration highlighted the company’s collaboration with Texas Instruments Inc. and Arm.

LeddarTech’s LeddarNavigator demonstrator highlighted the company’s advanced AI-based sensor fusion and perception software, enabled by Texas Instruments’ TDA4VE-Q1 processor and Arm’s automotive processors.

The LeddarVision Front-Entry (LVF-E) solution incorporates a processor from TI’s TDA4 family and is claimed to be the first-to-market low-level-fusion-based perception solution deployed on a single TDA4VE-Q1 processor. The TDA4VE-Q1 achieves one of the lowest system costs for L2/L2+ entry-level ADAS without sacrificing system performance, according to LeddarTech.

The TDA4VE-Q1 performs low-level sensor fusion processing and provides high-performance compute for both traditional vision and deep-learning algorithms, plus vision pre-processing acceleration, enabling a high level of system integration at low system power and cost. The pin- and software-compatible TDA4 product family offers scalability toward either higher-performance or lower-cost systems for advanced single- and multi-modal ADAS sensor fusion applications.
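
LeddarTech has not published its internals, but the distinction the company draws is between object-level fusion, where each sensor runs its own detector and only the resulting objects are merged, and low-level fusion, where raw or lightly processed data from all sensors is combined before a single perception stage. The sketch below is a deliberately simplified, hypothetical illustration of the low-level approach (camera pixels and projected radar returns stacked into one tensor), not LeddarVision code:

    import numpy as np

    def project_radar_to_image(radar_points, proj_matrix, image_shape):
        """Project radar returns (x, y, z, radial velocity) into the camera
        frame and rasterize range/velocity channels on the image grid."""
        h, w = image_shape
        extra = np.zeros((h, w, 2), dtype=np.float32)
        homog = np.c_[radar_points[:, :3], np.ones(len(radar_points))]
        pts = (proj_matrix @ homog.T).T                     # 3 x 4 camera projection
        u = (pts[:, 0] / pts[:, 2]).astype(int)
        v = (pts[:, 1] / pts[:, 2]).astype(int)
        ok = (pts[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        extra[v[ok], u[ok], 0] = pts[ok, 2]                 # depth/range channel
        extra[v[ok], u[ok], 1] = radar_points[ok, 3]        # velocity channel
        return extra

    def low_level_fuse(camera_frame, radar_points, proj_matrix):
        """Stack camera pixels and radar-derived channels into one tensor
        for a single downstream perception network to consume."""
        radar_ch = project_radar_to_image(radar_points, proj_matrix, camera_frame.shape[:2])
        return np.concatenate([camera_frame.astype(np.float32), radar_ch], axis=-1)

Fusing before detection preserves weak or ambiguous evidence from each sensor, which is one reason vendors argue low-level fusion can run on a single, lower-cost SoC.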

LeddarTech also has partnered with Arm to drive ADAS innovations with advanced compute platforms. The company has leveraged the Arm Cortex-A720AE CPU to minimize computational bottlenecks and enhance overall system efficiency by optimizing algorithms in the ADAS perception and fusion stack for Arm CPUs. (See LeddarTech’s case study for more information about the partnership.)

LeddarTech’s LVS-2+ and LVF-E solutions meet the 5-star NCAP 2025/GSR 2022 safety standards and are validated through research and development at its Tel Aviv, Montreal and Quebec City facilities. Key performance indicators are available to customers for review on request.

LeddarVision’s environmental perception framework (Source: LeddarTech Holdings Inc.)

Automotive partnerships

Ambarella, Inc. announced that its Oculii AI 4D imaging radar technology is deployed in the 2023 and 2024 Lotus Eletre electric hyper-SUV, as well as the 2024 Lotus Emeya fully electric hyper-GT, from Lotus Technology. The vehicles’ L2+ semi-autonomous systems, including highway and urban navigation on autopilot (NOA) and AEB, achieve ultra-long detection of over 300 meters, which provides more time to react safely to vehicles and other objects while traveling at highway and racetrack speeds. These systems were developed by Lotus Robotics, a wholly owned subsidiary of Lotus, as part of its autonomous-driving platform.

Ambarella said its AI software-defined architecture is capable of detecting objects over 500 meters away by centrally processing raw 4D imaging radar data and fusing it at a deep level with the vehicle’s other sensor information. It also provides the capability to shift processing power among sensors and adapt to real-time driving conditions.

The Ambarella architecture enables a higher angular resolution of 0.5 degrees and an ultra-dense point cloud with tens of thousands of detection points per frame. The Oculii AI radar algorithms also “uniquely” adapt radar waveforms to the environment, requiring an order of magnitude fewer antennas for centralized processing and reducing data bandwidth and power consumption compared with competing 4D imaging radar solutions.

The Oculii radar technology provides ultra-fine angular resolution of one degree in both Lotus vehicles, using only six transmit and eight receive antennas on each of the vehicles’ two radar modules. This is more than double the resolution of the nearest 4D imaging radar competitor using the same number of antennas, according to Ambarella.
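
For context, a rough, simplified estimate (not an Ambarella figure): 6 transmit and 8 receive antennas form 6 × 8 = 48 virtual channels per module, and conventional FFT beamforming over a uniform half-wavelength virtual array of that size would resolve roughly 2/N radians, or about 2.4 degrees. The gap between that conventional estimate and the one-degree (and 0.5-degree) figures is what the AI-based processing claims to close:

    import math

    # Simplified classical-resolution estimate for a MIMO imaging radar
    tx, rx = 6, 8
    virtual_channels = tx * rx                       # 48 virtual array elements
    # Aperture ~ N * lambda/2, so angular resolution ~ lambda/aperture = 2/N rad
    classical_res_deg = math.degrees(2.0 / virtual_channels)
    print(virtual_channels, round(classical_res_deg, 1))   # 48, ~2.4 degrees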

“This high angular resolution is important for clearly distinguishing, identifying and locating objects and people, both for nearby detection in crowded urban environments, as well as from a far distance when travelling at highway and racetrack speeds, to ensure timely and accurate avoidance and braking by the vehicles’ L2+ systems,” the company said.

Lotus designed Ambarella’s Oculii AI 4D imaging radar technology into the L2+ autonomous systems of the Eletre SUV and Emeya hyper-GT electric vehicles. (Source: Ambarella, Inc.)

Leopard Imaging Inc., a provider of intelligent vision solutions, highlighted its Automotive SerDes Alliance (ASA)-based cameras at AutoSens, announcing the launch of its next-generation solutions with the BMW Group. ASA supports five downstream speed grades ranging from 2 Gbits/s to 16 Gbits/s, which are suited for camera sensors, displays and other automotive applications requiring these high data rates.

Automotive SerDes can handle the massive data loads required for next-generation camera systems in vehicles, enabling the transmission of high-resolution video over long distances within the vehicle while maintaining low latency. This is particularly critical for real-time processing in autonomous-driving applications.
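
To put those link speeds in perspective, uncompressed raw video from a single high-resolution sensor quickly fills a multi-gigabit link. The figures below use hypothetical bit depths and a hypothetical 12-MP frame size, chosen only for illustration, and ignore serialization and protocol overhead:

    # Uncompressed raw-video bandwidth for two hypothetical camera configurations
    def raw_bandwidth_gbps(width, height, bits_per_pixel, fps):
        return width * height * bits_per_pixel * fps / 1e9

    print(raw_bandwidth_gbps(1920, 1536, 16, 60))   # ~2.8 Gbit/s, 3-MP viewing camera
    print(raw_bandwidth_gbps(4000, 3000, 12, 30))   # ~4.3 Gbit/s, 12-MP front camera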

Leopard showcased a demo with the BMW Group at the show.
