
How this new sensor gives self-driving cars a human-like view of the world

AEye develops advanced-vision hardware, software, and algorithms that act as the eyes and visual cortex of autonomous vehicles


AEye's image-recognition algorithms adjust on the fly. Image source: AEye.ai.

By Warren Miller, contributing writer

Self-driving cars are considered by many to be the future of transportation, but there are still a few hurdles to navigate on the way to completely autonomous driving. Most self-driving vehicles currently in development use LIDAR (light detection and ranging) sensors, which fire laser pulses to build three-dimensional representations of their surroundings, much the way deep-sea exploration vehicles use SONAR to map the ocean floor. That might sound like an amazing technological innovation, but LIDAR sensors have limitations that make it difficult for them to gather and process data the way that human beings do while they're driving. To create the automotive utopia of the future, self-driving cars will need better sensors.
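
At its core, LIDAR ranging works by timing laser pulses: the sensor fires a pulse, waits for the reflection, and converts the round trip into a distance. Here is a minimal sketch of that calculation, with an invented pulse timing used purely for illustration:

```python
# Time-of-flight ranging: a LIDAR sensor fires a laser pulse and measures how
# long the reflection takes to return. The one-way distance is half the round
# trip multiplied by the speed of light.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured pulse round-trip time into a one-way distance in metres."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# Example: a return detected 200 nanoseconds after the pulse left the sensor
# corresponds to an object roughly 30 m away.
print(f"{distance_from_round_trip(200e-9):.1f} m")  # ~30.0 m
```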

Enter Luis Dussan and his startup, AEye. He founded the company to improve self-driving technology but soon found that available sensor technology wasn't going to meet his needs. "We realized we had to build our own hardware, so we did," Dussan told MIT Technology Review.

LIDAR has two significant drawbacks: it's currently too expensive to implement commercially, and the lasers that the sensors use to build their 3D maps are restricted to predetermined angles. Imagine that you're driving up one of San Francisco's extremely steep hills; some of the LIDAR lasers will be pointed at the sky. Dussan set out to make a laser device that can steer its beams toward the directions most critical to a safe driving experience while spending less attention on the periphery, mirroring the way that human drivers focus on what's directly in front of them while still maintaining some awareness of what's happening to their right, left, and rear.
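
One way to picture that kind of beam steering is as a budgeting problem: each frame, the sensor has only so many laser shots to spend, and the driving software decides where they go. The sketch below is a toy model of that idea, not AEye's actual algorithm; the region names and priority weights are invented for illustration.

```python
# Toy model of an adaptive scan pattern: a fixed budget of laser shots per
# frame is divided among regions of the field of view in proportion to a
# priority weight supplied by the perception software.
def allocate_shots(shot_budget: int, region_priorities: dict[str, float]) -> dict[str, int]:
    """Split shot_budget across regions in proportion to their priority weights."""
    total = sum(region_priorities.values())
    return {region: round(shot_budget * weight / total)
            for region, weight in region_priorities.items()}

# Highway driving: concentrate on the lane ahead, keep light coverage elsewhere.
print(allocate_shots(10_000, {"road ahead": 6.0, "left shoulder": 1.5,
                              "right shoulder": 1.5, "sky/horizon": 1.0}))
# {'road ahead': 6000, 'left shoulder': 1500, 'right shoulder': 1500, 'sky/horizon': 1000}
```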

AEye’s system uses high-speed image-recognition algorithms to do just that. It can adjust on the fly to its circumstances and surroundings according to the demands of a car’s onboard self-driving software. “You can trade resolution, scene revisit rate, and range at any point in time,” said Dussan. “The same sensor can adapt.” If you were driving on a lightly trafficked freeway, for instance, the sensors would probably be trained on the road directly ahead of the vehicle. If you were driving slowly down a city street in Manhattan, however, the sensors might be scanning the rows of parked cars on either side for pedestrians about to step into the road or children running out in pursuit of a ball. The AEye device can also incorporate color into the images that it processes, which would be useful for identifying brake lights or the reflective vests that joggers wear in the early morning or late evening.
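
A rough way to see why those three quantities trade off against one another: if the sensor can take only a fixed number of range measurements per second, then finer angular resolution means more points per frame and therefore fewer frames per second, and extending range by spending extra shots on each point lowers the revisit rate as well. The model and numbers below are invented for illustration and are not AEye's specifications.

```python
# Back-of-the-envelope model of the resolution / revisit-rate / range trade-off.
# Assume the sensor can take a fixed number of range measurements per second;
# the budget and scan parameters here are illustrative only.
MEASUREMENTS_PER_SECOND = 1_000_000

def revisit_rate_hz(h_fov_deg: float, v_fov_deg: float,
                    resolution_deg: float, shots_per_point: int = 1) -> float:
    """Frames per second achievable for a given field of view and angular
    resolution; spending more shots per point (e.g. to extend range) lowers
    the rate proportionally."""
    points_per_frame = (h_fov_deg / resolution_deg) * (v_fov_deg / resolution_deg)
    return MEASUREMENTS_PER_SECOND / (points_per_frame * shots_per_point)

# Wide scan at coarse resolution vs. a narrow, fine, longer-range stare:
print(revisit_rate_hz(70, 30, 0.2))                      # ~19 Hz full-field scan
print(revisit_rate_hz(10, 5, 0.05, shots_per_point=4))   # 12.5 Hz narrow stare
```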

There are still a few hills to climb before the AEye system can be incorporated into commercially available vehicles, however. Currently, the device can only survey a 70° field of view, meaning that a car would need a half-dozen such sensors spaced around its body to achieve 360° coverage. Although AEye hasn’t publicly stated how much its system costs, similar high-end sensor systems, like the one offered by the market leader, Velodyne, are said to be prohibitively expensive and, therefore, impractical for mass implementation in commercially sold cars. While Dussan admits that his company’s system is designed for a more high-end market, he said, “If you compare true apples to apples, we’re going to be the lowest-cost system around.”
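
The half-dozen figure follows directly from the field of view: covering a full 360° ring with 70° sensors takes six units (360/70 rounds up to 6), which leaves roughly 60° of total overlap around the ring. A quick check:

```python
import math

# A 70° field of view per sensor implies six units for full 360° coverage.
SENSOR_FOV_DEG = 70
sensors_needed = math.ceil(360 / SENSOR_FOV_DEG)
print(sensors_needed)                          # 6
print(sensors_needed * SENSOR_FOV_DEG - 360)   # 60 degrees of total overlap
```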
