
Eye tracking – a new form of interaction between man and machine

Communication by eye contact makes using electronics more intuitive

By Christoph Goeltner
Product Manager
Osram Opto Semiconductors
www.osram-os.com

Powerful computer chips, highly efficient infrared LEDs and modern camera sensors now make it possible for previously complex eye-tracking systems to be adapted for consumer applications. These systems enable devices to detect a user's eye movements and recognize what the user would like to do next. In conjunction with established input methods, eye tracking opens up a wealth of new intuitive interactions between man and machine.


Fig. 1: Activation by eye contact. Eye tracking in conjunction with established input functions opens up completely new ways of operating electronic devices. Source: Osram

For decades the keyboard and the mouse have been our traditional tools for operating computers. With the arrival of mobile devices such as smartphones and tablets, new techniques were required to communicate efficiently with these mini-computers which have neither a keyboard nor a mouse. Touchscreens became a key technology for making these electronic companions user-friendly. The addition of speech recognition made it all even more intuitive.

The device that anticipates

The amount of electronic equipment we want to communicate with will increase still further as the internet of things becomes commonplace. A good example is the smart home, for which voice-controlled thermostats are already available. We will also be interacting with robots in the future. Apart from interactive industrial robots, there are robot assistants for the home and for the health care sector already in development. Today, devices can receive commands via touchscreens and hear what users want through their built-in microphones. With the aid of eye-tracking systems they would also be able to detect what users are looking at and anticipate what they want to do next. This will open up a whole series of new possibilities for intuitive interaction between man and machine.

In many cases the hardware for eye tracking is already available

Eye tracking systems detect a person's eye movements and the direction in which they are looking. Originally they were developed for market research, behavior analysis and usability studies, and they have also been in use for some time to help people who no longer have the use of their hands to operate computers. Many of these systems use infrared light to illuminate the user's eyes, take a picture with a camera and calculate eye movement from the image data. Such systems used to need special high-quality cameras, light sources and software; sometimes hardware accelerators were added to process the huge amount of image data. Today, extremely powerful chips, compact camera sensors and modern high-power LEDs enable eye-tracking functionality to be integrated in consumer devices such as smartphones. What's more, in many devices the camera sensor and the infrared light source are already being used for other functions such as facial recognition or iris identification. All that is then needed is appropriate software to integrate eye tracking as an additional feature.

Current developments

Concepts that enable eye tracking to be used as a new additional man/machine interface are currently being developed in various areas. Smartphones and tablets with eye-tracking functionality have already been demonstrated in which eye contact is used to activate an icon or move a character on screen. Gaming computers with eye tracking give gamers a sense of being more involved in the action. Systems have been presented in which a gamer can use eye movements to control the viewpoint of an on-screen character instead of using a mouse or trackpad. Eye tracking can be used in much the same way for computers – for example by using your eyes to scroll through a document. In the smart home sector, too, there are ways of using eye contact to communicate with a wide range of devices. Smart TVs with eye tracking have already been demonstrated, for example. Possible applications for these systems have also been proposed in the automotive sector. A good example is the driver activity assistant which monitors the driver’s eyes to detect signs of fatigue. An eye-tracking function could also be used to detect the direction in which the driver is looking and determine whether he is paying attention to the road ahead or being distracted. Such information will help to avoid critical situations on the road.

Eye-tracking systems for the consumer sector

Modern eye-tracking systems for the applications mentioned above are based on infrared LEDs (IREDs) for illuminating the eyes and a high-resolution camera sensor that registers the light reflected by the eyes. Image processing algorithms take this raw data and calculate the positions of the pupils (Fig. 2). Using information about the position of a reference object, such as the screen, special software is able to determine exactly where the user is looking. Infrared illumination guarantees the necessary contrast between the iris and the pupil, whatever the eye color, particularly in the dark or if the screen background is very bright.


Fig. 2: An eye tracking system illuminates the eyes with infrared light and captures an image with a camera sensor. The image data is used to determine the positions of the pupils and calculate the direction in which the user is looking. Source: Osram
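The processing chain sketched in Fig. 2 can be illustrated with a few lines of code. The following Python sketch, based on OpenCV, shows the basic idea only: a simple dark-pupil detection step followed by a polynomial calibration that maps pupil coordinates to screen coordinates. The threshold values, the calibration model and the function names are illustrative assumptions and do not represent any particular manufacturer's software; a real product would also compensate for head movement.

import cv2          # OpenCV for image processing
import numpy as np

def find_pupil_center(ir_frame):
    """Estimate the pupil center in a grayscale IR camera frame (uint8)."""
    blurred = cv2.GaussianBlur(ir_frame, (7, 7), 0)
    # In a typical off-axis ("dark pupil") arrangement the pupil appears as a dark disk.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)          # largest dark blob as pupil candidate
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # centroid (x, y) in pixels

def gaze_to_screen(pupil_xy, calib_coeffs):
    """Map pupil coordinates to screen coordinates with a polynomial obtained
    beforehand by having the user fixate a set of known on-screen targets."""
    x, y = pupil_xy
    features = np.array([1.0, x, y, x * y, x * x, y * y])
    return features @ calib_coeffs                      # calib_coeffs has shape (6, 2)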

Such systems currently have a range of up to one meter. For smartphones and tablets the typical working distance is around 30 cm, for desktop computers around 60 cm. The achievable resolution on the screen, i.e. the grid size into which gaze positions can be resolved, is about 1 cm for tablets and about 2 cm for computers. The number of IREDs used and the specific arrangement of emitters and camera depend on the type of application, in other words on the working distance and the size of the area to be covered. The setup can also vary with the eye-tracking software used, because the geometry of the design also depends on how reliably the algorithms can determine the orientation of the pupils. Generally speaking, the emitter and camera sensor have to be arranged at a certain angle and distance with respect to one another to avoid glare from spectacles or direct reflections from the eyes to the sensor. The greater this distance, the better the signal quality and the more flexible the choice of the optimum distance between the user and the device.
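A short worked example shows how these figures fit together. If the angular gaze accuracy is assumed to be roughly 1.9 degrees (a value inferred here from the resolutions quoted above, not stated in the article), the on-screen resolution follows directly from the working distance:

import math

def on_screen_resolution_cm(working_distance_cm, accuracy_deg=1.9):
    """On-screen resolution as the distance subtended by the angular gaze accuracy."""
    return working_distance_cm * math.tan(math.radians(accuracy_deg))

print(round(on_screen_resolution_cm(30), 1))   # ~1.0 cm at tablet distance
print(round(on_screen_resolution_cm(60), 1))   # ~2.0 cm at desktop distance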

Infrared LEDs for eye-tracking systems

Unlike iris scanners, for example, which mostly need a specific wavelength, eye-tracking systems operate within a broad spectral range. Often they make use of existing iris scanning or facial recognition hardware together with its IREDs, which have wavelengths of 850 or 810 nm. IREDs with a wavelength of 850 nm are perceived by the human eye as a weak red glow. Many manufacturers of eye-tracking solutions therefore prefer 940 nm, as this light is virtually invisible to the naked eye. The disadvantage at present is that the camera sensors in general use are optimized for visible light and have lower sensitivity in the infrared spectrum. At 940 nm this reduction is so significant (Fig. 3) that the infrared illumination has to be boosted in order to achieve the same signal strength as with an 850 nm light source. In view of the huge array of different applications based on infrared illumination, however, camera manufacturers are working on new versions with good infrared sensitivity.


Fig. 3: The sensitivity of standard camera sensors that are optimized for visible light drops significantly between 850 nm and 940 nm. Eye tracking systems with 940 nm IREDs have the advantage that they are barely perceptible to the human eye. In most cases, they have to compensate for the reduced signal level with higher operating currents. Source: Osram
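The impact of the lower sensor sensitivity at 940 nm can be estimated with a simple scaling calculation. The relative quantum-efficiency values in the following sketch are illustrative assumptions; the actual figures depend on the camera sensor used.

def required_output_940(power_850_mW, qe_850=0.30, qe_940=0.10):
    """Radiant power needed at 940 nm to match the signal level obtained at 850 nm,
    assuming the signal scales with sensor quantum efficiency."""
    return power_850_mW * (qe_850 / qe_940)

print(required_output_940(770))   # ~2300 mW if the sensitivity drops to one third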

Ideally, both eyes should be within the capture area of the camera sensor, and the entire eye should be evenly illuminated. The amount of infrared light that is needed depends on the working distance and may be several watts, even for mobile devices. To keep the thermal load caused by the high operating currents as low as possible, the emitters are operated in pulsed mode. Despite this, thermal management is an important aspect of the design, particularly in ever lighter and thinner smartphones and tablets. Here, the efficiency of the IRED is a major factor in addition to its optical output: the greater the efficiency, the less heat is generated.
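The benefit of pulsed operation for the thermal budget can be estimated as follows. The pulse width, frame rate and forward voltage in this sketch are assumed values for illustration only.

def average_power_W(peak_current_A, forward_voltage_V, pulse_width_ms, frame_period_ms):
    """Average electrical input power when the IRED is only driven while the
    camera shutter is open."""
    duty_cycle = pulse_width_ms / frame_period_ms
    return peak_current_A * forward_voltage_V * duty_cycle

# Example: 1 A pulses of 3 ms at 30 frames per second, assumed forward voltage of about 1.6 V
print(average_power_W(1.0, 1.6, 3.0, 1000 / 30))   # ~0.14 W average electrical input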

For such applications Osram has developed the Oslon Black Series and has achieved a record efficiency of 48 percent with the SFH 4715A. This 850 nm emitter typically delivers an optical output of 770 mW at 1 A and is currently the most efficient IRED at this operating current. Even higher outputs are achieved by stack versions, in which nanostack technology provides two emission centers per chip: the SFH 4715AS typically produces 1340 mW of light at a current of 1 A. Two versions with emission angles of 90 and 150 degrees cover a wide range of different designs. The Oslon Black version with an optical output of 990 mW at 1 A is ideal as a 940 nm light source.
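The stated figures for the SFH 4715A allow a rough estimate of the heat that has to be managed while a pulse is on. The electrical input power is derived here from the quoted optical output and efficiency; it is not taken from a data sheet.

optical_mW = 770
efficiency = 0.48
electrical_mW = optical_mW / efficiency   # ~1600 mW electrical input at 1 A during the pulse
heat_mW = electrical_mW - optical_mW      # ~830 mW dissipated as heat during the pulse
print(round(electrical_mW), round(heat_mW))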


Fig. 4: The Oslon Black SFH 4715AS is one of the most powerful IREDs currently available with a wavelength of 850 nm. It provides 1340 mW of light at a current of 1 A. Thanks to its low height it will fit not only in present-day smartphones but also in the next generation of devices. Source: Osram

A particular feature of the Oslon Black is its low component height of only 2.3 mm. Despite the trend toward ever thinner designs, this IRED therefore fits not only in present-day smartphones but also in the next generation of devices.

Like any application featuring infrared light sources, eye-tracking systems must comply with eye safety standards. The amount of infrared radiation that reaches a normal user is relatively low. However, precautions must be taken for situations where a technician, for example, may be at risk of looking at the infrared light source from close up. A proximity sensor linked to the eye-tracking system ensures that in such cases the IRED is switched off. Further information on the safe design of optical systems can be found, for example, in the Osram Application Note on Eye Safety. 
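Such an interlock can be very simple in software. The following sketch assumes a hypothetical driver object for the IRED and a distance threshold that would in practice be derived from the eye-safety analysis.

MIN_SAFE_DISTANCE_CM = 10   # assumed threshold, set according to the eye-safety analysis

def update_illumination(proximity_distance_cm, ired):
    """Switch the IRED off whenever an object is closer than the safe distance."""
    if proximity_distance_cm < MIN_SAFE_DISTANCE_CM:
        ired.disable()   # hypothetical driver call
    else:
        ired.enable()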

We are surrounded by so many complex electronic devices that additional techniques are needed for intuitive interaction between man and machine. The combination of infrared illumination and camera sensors provides the basis for a wide range of interactive techniques in which a device can see its users and interpret their intentions. The example of eye tracking shows how new types of interaction can be implemented as software solutions on the basis of this hardware. Innovations in hardware components are also driving this development; there is a trend, for example, toward light sources with a wavelength of 940 nm. Osram is therefore expanding its portfolio of IR emitters for facial recognition, eye tracking and similar applications.




