
The state of the automotive sensing industry: It’s complicated

The AutoSens conference covered a range of automotive sensing choices and challenges to help engineers evaluate the right sensors for their applications.

What does it take to make cars see — to improve automotive sensing? The short answer: it’s complicated.

The market has lots of CMOS image sensors that can capture stunning pictures that dazzle the human eye. What about sensors that not only generate images but place them into a context that enables machines to easily and accurately digest the data?

Industry experts know that a single sensing modality can’t do the job (though Mobileye might disagree). By mixing and matching different sensors — vision, radar, LiDAR and ultrasound — autonomous vehicle (AV) developers are looking for ways to orchestrate data generated by more than one sensor. They believe fused sensory data can get closer to human perception.
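To make the idea concrete, here is a minimal sketch of one of the simplest fusion schemes: inverse-variance weighting of two independent range estimates. The sensor pairing, noise figures and function below are illustrative assumptions, not any AV developer’s actual pipeline, which would typically track full object states with Kalman or particle filters.

```python
# Minimal sketch: fuse independent range estimates from two sensing
# modalities by inverse-variance weighting. Real perception stacks use
# far richer models, but the principle is the same: trust each sensor
# in proportion to how little it is expected to err.

def fuse_range(camera_range_m, camera_var, radar_range_m, radar_var):
    """Return the variance-weighted range estimate and its variance."""
    w_cam = 1.0 / camera_var           # weight is inversely proportional to noise
    w_rad = 1.0 / radar_var
    fused = (w_cam * camera_range_m + w_rad * radar_range_m) / (w_cam + w_rad)
    fused_var = 1.0 / (w_cam + w_rad)  # fused estimate is tighter than either sensor alone
    return fused, fused_var

# Illustrative numbers only: the camera reads 42 m (noisy at long range),
# the radar reads 45 m (much tighter range accuracy).
estimate, variance = fuse_range(42.0, 4.0, 45.0, 0.25)
print(f"fused range: {estimate:.1f} m, variance: {variance:.2f}")
```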

But there’s a complication: whatever sensors perceive at the edge won’t stay at the edge. Captured sensory data must be processed inside a vehicle to be interpreted by machines. This requires massive processing power inside the vehicle brain. It demands updated in-vehicle networks fed by a fatter pipe with very little latency. In the end, it takes a sensor village of enabling machines to make safe and sound decisions.
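Some back-of-the-envelope arithmetic shows why the pipe matters. The resolutions, frame rates and bus speeds below are illustrative assumptions rather than any OEM’s specification, but they make the bandwidth gap plain:

```python
# Rough data-rate estimates for a hypothetical camera suite.
# Resolutions, frame rates and bit depths are illustrative assumptions.

def camera_mbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) video data rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

front_camera = camera_mbps(1920, 1080, 24, 30)       # one 1080p30 RGB stream: ~1.5 Gbit/s
surround_cams = 4 * camera_mbps(1280, 800, 16, 30)   # four lower-resolution surround-view cameras

print(f"front camera:     {front_camera:,.0f} Mbit/s")
print(f"surround cameras: {surround_cams:,.0f} Mbit/s")
print("classic CAN bus: 1 Mbit/s | 100BASE-T1 automotive Ethernet: 100 Mbit/s")
```

Compression and on-sensor processing narrow the gap, but the mismatch between raw sensor output and legacy buses is exactly why updated in-vehicle networks keep coming up.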


(Image caption) What does it take to make cars see better? It’s complicated. Multiple sensing modalities, faster processing power in the vehicle brain, new car architecture with an updated in-vehicle network, self-cleaning sensors, a common-sense layer and a lot more.


Vehicle senses are manifestly complicated. To make some sense of it all, we wrote the book on the subject. The just-published volume, titled “AspenCore Guide to Sensors in Automotive – Making Cars See and Think Ahead,” is available at the EE Times Store.

But these are not laurels we’re resting upon. Technology continues to advance. To better understand AV system designers’ challenges, we turned to AutoSens, a forum where experienced engineers and professionals gather to compare notes on advanced driver-assistance system (ADAS) and AV development. Produced by Sense Media, AutoSens has hosted a series of conferences over the last several years.

This year’s event was, of course, virtual. We sent AspenCore’s best editors to the just-concluded AutoSens Brussels Edition*. They came back with a range of technology and product stories. Their reporting is the basis of a Special Project offering readers a snapshot of the current state of the automotive sensing world.

What we learned

Of all subjects, “sensor mix” is an eternally popular topic that spawns disparate opinions; no single right answer exists. The questions include what exactly constitutes the right sensor mix and how best to optimize it.

Anne-Françoise Pelé, editor-in-chief of EE Times Europe, captured the debate at the recent AutoSens conference. Pelé also covered the broadly diverse LiDAR landscape, where demand, technology and prices are continuously shifting.

Gina Roos, editor-in-chief of Electronic Products, cut to the chase and discussed the biggest pain points for automotive image sensors. Can your vehicle’s lane-keeping feature perform even when the paint on the lane marker is faded or obscured by rain? Does your car recognize a red light when the traffic signals are flickering LEDs?

EE Times also caught up with Ross Jatou, vice president and general manager of the automotive solutions division at ON Semiconductor, during the virtual AutoSens. In our chat, we discussed topics ranging from driver monitoring systems and LiDAR to edge processing, NCAP (the New Car Assessment Program) and the changing relationship between car and driver.

Sensor degradation, an issue little covered and yet fraught with potentially huge consequences, popped up. As cars rack up miles, cameras and radar will inevitably be obscured by mud, leaves and other real-world messiness. How do you keep sensors clean? Further, the quality of sensors will eventually deteriorate with age, weather, and wear and tear. How will robocars know it’s time to get a new pair of glasses? Majeed Ahmad, EDN’s editor-in-chief, explored those issues with Rob Stead, an organizer of AutoSens.
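One way a perception stack might notice that its “glasses” need attention is to watch a simple image-sharpness statistic over time. The sketch below is only an illustration of that idea, assuming OpenCV and a made-up threshold; it uses the variance of the Laplacian as a blur proxy, whereas production health monitoring is far more elaborate.

```python
# Minimal sketch of camera-health monitoring: a soiled or fogged lens
# lowers image sharpness, approximated here by the variance of the
# Laplacian. The threshold is a placeholder, not a validated value.

import cv2  # OpenCV

BLUR_THRESHOLD = 100.0  # made-up value; would be tuned per camera, lens and scene statistics

def camera_looks_degraded(frame_bgr) -> bool:
    """Return True if the frame is suspiciously blurry (possible dirt, fog or wear)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < BLUR_THRESHOLD

# A real system would evaluate a rolling window of frames and trigger a
# cleaning cycle or a driver warning, not act on a single snapshot.
```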

AutoSens is all about perception. But the next phase for the conference, I suspect, is how best to add a “common sense” layer to robocar perception. EE Times recently explored efforts to make sense of how a “driving policy” like Responsibility-Sensitive Safety connects to perception. Highlighting this Special Project is Rob Stead’s essay, “Safety, Not Autonomy, Is the Objective.” Stead wrote:

With all the hype about robotaxis and the utopic future of mobility over the past five years or so, we have lost sight of what autonomy was all about in the first place. Right now, the elastic motion of the sine wave is bringing us back around to focus on what the original objective was, namely, safety.

We couldn’t agree more.

*While AutoSens Brussels Edition just concluded, the event organizer will start AutoSens Detroit Edition next month.

Articles in this Special Project:

6 Considerations for Integrating Sensors in Vehicles

By Anne-Françoise Pelé

There is no one formula for implementing sensor technology for assisted driving systems. Here are a half-dozen options.    

 

LiDAR Market: Promising, But Caution Needed

By Anne-Françoise Pelé

LiDAR innovation has moved fast, and automotive applications are expected to be the main drivers in the next five years.

 

Safety, Not Autonomy, Is the Objective

By Rob Stead

The sine wave of hype has swung the auto industry’s focus back to safety.

 

 

OmniVision addresses biggest pain points for automotive image sensors

By Gina Roos

OmniVision’s latest products focus on the must-haves in automotive image sensors: high resolution, increased high dynamic range (HDR), LED flicker mitigation (LFM) and low power consumption.

 

In-vehicle connectivity poses big challenges for automotive industry

By Gina Roos

Auto OEMs face four key roadblocks when it comes to in-vehicle connectivity: limited bandwidth, too many cables, distance limitations, and harsh environments.  

 

3 basic facts about automotive sensor degradation

By Majeed Ahmad

Sensor degradation is intrinsically linked to the most crucial issue in the automotive design world: safety.

 

 

Sensor fusion is a prerequisite for autonomous vehicles

By Anne-Françoise Pelé

Even as sensor fusion maps the road to full autonomy, many technical challenges remain.

 

 

Q&A with ON Semi’s Ross Jatou on ADAS, DMS, Car & Driver

By Junko Yoshida

An interview with ON Semi’s Ross Jatou about his company’s growth strategy in the ADAS and autonomous vehicle markets, and about the market itself.

 

Making Sense of ‘Driving Policy’ inside Robocar Brains

By Junko Yoshida

In covering automotive electronics, I write about ‘perception.’ I’ve also written about ‘driving policy.’ But it wasn’t until recently that I finally wrapped my head around what driving policy does to perception.

 


A new book, AspenCore Guide to Sensors in Automotive: Making Cars See and Think Ahead, with contributions from leading thinkers in the safety and automotive industries, heralds the industry’s progress and identifies the engineering community’s remaining challenges.

It’s available now at the EE Times bookstore.

