
How does an AMR use time-of-flight technology?

AMRs powered by time-of-flight technology deliver benefits in mapping, navigation, localization, and obstacle detection and avoidance in warehouse operations.

When we talk about time-of-flight (ToF) technology, the most important thing to consider is how drastically embedded-vision technology has changed over the years. From the 1970s, when ToF was first theorized, to the present day, these leaps have opened a new era of advanced imaging for autonomous mobile robots (AMRs). AMRs are most commonly used in industrial warehouses, where ToF technology plays a pivotal role in helping the robots perceive their surroundings with a high level of accuracy.

What is ToF technology?

ToF measures how long it takes an object, particle or wave to travel from Point A to Point B. In that sense, ToF and sonar work on a similar principle. A ToF system calculates the distance to an object by measuring the time between the emission of a signal at the source and its return to the detector after being reflected off the object. The typical signals used with ToF technology are sound and light; light-based ToF sensors calculate distance from the time it takes the emitted light to be reflected from the object back to the source.
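As a concrete illustration of the principle, here is a minimal sketch of that round-trip calculation for a single measurement, written so it works for either a light or a sound signal. The function name and example timings are illustrative assumptions, not taken from any particular sensor.

```python
# Minimal sketch of the ToF principle: distance is half the round-trip time
# multiplied by the speed of the signal. Names and example values are
# illustrative, not from any specific sensor API.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0
SPEED_OF_SOUND_M_PER_S = 343.0  # in air, roughly

def distance_from_round_trip(round_trip_time_s: float,
                             signal_speed_m_per_s: float = SPEED_OF_LIGHT_M_PER_S) -> float:
    """Return the one-way distance for a signal that travels out and back."""
    return signal_speed_m_per_s * round_trip_time_s / 2.0

# A light pulse returning after ~6.67 ns corresponds to about 1 m...
print(distance_from_round_trip(6.67e-9))                          # ≈ 1.0 m (light)
# ...while a sonar ping takes about 5.8 ms to cover the same distance.
print(distance_from_round_trip(5.8e-3, SPEED_OF_SOUND_M_PER_S))   # ≈ 1.0 m (sound)
```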

How it works

The ToF camera system uses an illuminating light source, typically laser diodes or LEDs, and a sensor to measure distance. The light source sends out a pulse, and each sensor pixel measures how long it takes for that light to return, determining the distance to the object from the length of the round trip.

Key components of a ToF camera system (Source: e-con Systems)
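To make the pulse-and-return scheme concrete, the sketch below applies the same calculation per pixel, which is how a ToF sensor produces a whole depth frame at once. The array shape and timing values are assumptions for illustration; real sensors perform this conversion on-chip, often by measuring phase rather than raw time.

```python
# Hedged sketch: convert per-pixel round-trip times into a depth map.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_map_from_times(round_trip_times_s: np.ndarray) -> np.ndarray:
    """Convert an HxW array of round-trip times (s) into distances (m)."""
    return C * round_trip_times_s / 2.0

# Example: a tiny 4x4 "frame" where every pixel saw a ~13.3 ns round trip (~2 m).
frame = np.full((4, 4), 13.3e-9)
print(depth_map_from_times(frame))
```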

How ToF technology revolutionized mobile robots

Modern stereo-vision systems often use an IR pattern projector to irradiate the surroundings or scene and then compare the differences between the images from two 2D sensors; the projected pattern gives them usable low-light performance when determining the range or distance of objects. ToF technology takes this a notch higher by combining a sensor, a lighting unit and a depth-processing unit that calculates depth directly for mobile robots.

Because the depth calculation happens on the camera itself, ToF systems can be leveraged out of the box without further calibration. While performing scene-based tasks, the ToF camera mounted on the mobile robot extracts 3D images at high frame rates and allows rapid segmentation of foreground from background. And because ToF cameras use active lighting components, mobile robots can perform tasks in brightly lit conditions or in complete darkness.
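As a small illustration of that foreground/background segmentation, the sketch below splits a depth map at a chosen cut-off distance with a single vectorized comparison. The threshold and array values are illustrative assumptions, not vendor defaults.

```python
# Hedged sketch: segment "foreground" as anything closer than a cut-off distance.
import numpy as np

def segment_foreground(depth_m: np.ndarray, cutoff_m: float = 1.5) -> np.ndarray:
    """Return a boolean mask that is True where the scene is closer than cutoff_m."""
    return depth_m < cutoff_m

depth = np.array([[0.8, 0.9, 3.2],
                  [1.1, 2.9, 3.1],
                  [1.0, 2.8, 3.0]])   # metres
print(segment_foreground(depth))      # True marks near (foreground) pixels
```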

How a ToF camera measures depth (Source: e-con Systems)

ToF-powered mobile robots in automated warehouses

The role of ToF cameras is well-defined in warehouse operations. These cameras equip AMRs and automated guided vehicles (AGVs) with depth-sensing intelligence, helping them perceive their surroundings and capture depth imaging data so they can carry out business-critical functions with accuracy, convenience and speed. These include:

Mapping

These cameras create a map of an unknown environment by measuring the transit time of the light reflected from objects back to the source. Mapping relies on simultaneous localization and mapping (SLAM) algorithms, which require 3D depth perception for precise results. Using 3D depth sensing, for instance, these cameras can be used to lay out predetermined paths within the premises for mobile robots to follow.
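As an illustration of how a depth frame feeds a mapping pipeline, the sketch below back-projects a ToF depth map into a 3D point cloud of the kind a SLAM front end could consume. The pinhole intrinsics (fx, fy, cx, cy) are made-up example values, not parameters of any particular camera.

```python
# Hedged sketch: back-project an HxW depth map into camera-frame 3D points.
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray,
                         fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Convert a depth map (metres) into an Nx3 array of 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a 2x2 frame of points all 2 m away, with invented intrinsics.
print(depth_to_point_cloud(np.full((2, 2), 2.0), fx=500.0, fy=500.0, cx=1.0, cy=1.0))
```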

Navigation

AMRs normally move along a specified map from Point A to Point B, but with SLAM algorithms they can also navigate an unknown environment. By making use of ToF technology, AMRs can process information quickly and analyze their environment in 3D before deciding on a path.
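To make the "deciding on a path" step tangible, here is a simplified sketch: once depth data has been reduced to a 2D occupancy grid (True marks a blocked cell), a basic breadth-first search can find a route from Point A to Point B. The grid, coordinates and search strategy are illustrative stand-ins for a real planner, not part of any ToF camera API.

```python
# Hedged sketch: breadth-first search over an occupancy grid built from depth data.
from collections import deque

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # walk back to recover the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc] and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None

occupancy = [[False, False, True],
             [True,  False, True],
             [True,  False, False]]
print(plan_path(occupancy, (0, 0), (2, 2)))   # [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]
```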

Obstacle detection & avoidance

AMRs are likely to cross paths with obstacles as they navigate a warehouse, which makes it a priority for them to process information quickly and precisely. With ToF cameras, AMRs can detect an obstacle in their path and redefine their route around it.
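One simple way to picture this, sketched below under assumed thresholds: watch the depth pixels in a region of interest ahead of the robot and flag a stop-and-replan whenever something comes inside the stopping distance. The region, frame size and distances are example values, not defaults of any camera or robot stack.

```python
# Hedged sketch: reactive obstacle check on a region of interest in a depth frame.
import numpy as np

def obstacle_ahead(depth_m: np.ndarray, roi_rows: slice, roi_cols: slice,
                   stop_distance_m: float = 0.6) -> bool:
    """True if any valid pixel in the region of interest is closer than stop_distance_m."""
    roi = depth_m[roi_rows, roi_cols]
    valid = roi[roi > 0]                     # assume 0 marks invalid / no-return pixels
    return valid.size > 0 and float(valid.min()) < stop_distance_m

frame = np.full((240, 320), 4.0)             # an open aisle, 4 m of clearance
frame[100:140, 150:170] = 0.4                # something 40 cm ahead in the robot's lane
if obstacle_ahead(frame, slice(80, 160), slice(120, 200)):
    print("Obstacle inside stopping distance: slow down and replan the path.")
```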

Localization

Usually, the ToF camera helps AMRs locate themselves on a known map by scanning the environment and matching real-time information against pre-existing data. Conventional localization relies on a GPS signal, which poses a challenge in indoor environments like warehouses, so an alternative is a localization method that works locally. ToF cameras capture 3D depth data and calculate the distance to reference points on the map; then, using triangulation, the AMR can pinpoint its exact position. This enables seamless localization, making navigation easy and safe.
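As a rough illustration of that last step, the sketch below estimates a 2D position by least squares from measured distances to a few known reference points (the article calls this triangulation; working from ranges, it is often termed trilateration). The landmark coordinates and ranges are made-up example values.

```python
# Hedged sketch: solve for position from distances to known map reference points.
import numpy as np

def locate(references: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate an (x, y) position from three or more reference points and ranges."""
    x1, y1 = references[0]
    d1 = distances[0]
    a_rows, b_rows = [], []
    for (xi, yi), di in zip(references[1:], distances[1:]):
        # Linearized range equations: subtract the first landmark's equation.
        a_rows.append([2.0 * (xi - x1), 2.0 * (yi - y1)])
        b_rows.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    solution, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return solution

refs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])  # known landmarks on the map
true_pos = np.array([4.0, 3.0])
ranges = np.linalg.norm(refs - true_pos, axis=1)         # what the ToF camera would measure
print(locate(refs, ranges))                              # ≈ [4.0, 3.0]
```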

About the author

Maharajan Veerabahu, co-founder, e-con Systems

Maharajan Veerabahu is co-founder and vice president of product design services at e-con Systems. He and his co-founders, Harishankkar Subramanyam and Ashok Babu Kunjukkannan, got together early in their professional journeys and decided to leave their well-paying jobs to take the plunge into entrepreneurship.

A first-time entrepreneur, Veerabahu has a deep passion for building products. His technical expertise spans engineering solutions, developing new technology teams and strengthening the core of the organization.

Veerabahu is an engineering graduate of the Thanthai Periyar Government Institute of Technology, Madras University. Soon after graduating, he started e-con Systems, which designs, builds and ships high-quality OEM camera modules to the U.S., Germany, Japan and more than 30 other countries outside of India. An end-to-end OEM camera solutions provider, e-con Systems is a one-stop shop for any product company looking to integrate a camera module into its device.

