The age of sensors is upon us. These days, it's unusual to experience an electronic consumer product that doesn't use sensors to create new experiences for its users. As micro-electromechanical systems (MEMS) technology becomes less expensive and further miniaturized, it is fueling the penetration of sensors into new applications and creating new opportunities in the sensor market.
Sensors are now found in a wide variety of applications, such as smart mobile devices, automotive systems, industrial control, climate monitoring, oil exploration, and healthcare. Used almost everywhere, sensor technology is now beginning to closely mimic the ultimate sensing machine: the human being.
The technology that allows this to happen is sensor fusion, which leverages a microcontroller (a “brain”) to fuse individual data collected from multiple sensors to get a more accurate and reliable view of the data than one would get by using the data from each discrete sensor on its own. For sensors, it is certainly true that the whole is much greater than the sum of its parts.
How sensor fusion works
Sensor fusion technologies give users an enhanced experience, leveraging and combining 3D accelerometers, 3D gyroscopes and 3D magnetometers (which measure the components of the magnetic field in a particular direction, relative to the spatial orientation of a given device). Each of these sensor types provides unique functionality, but also has limitations:
- Accelerometer: x-, y-, and z-axis linear motion sensing, but sensitive to vibration.
- Gyroscope: pitch, roll, and yaw rotational sensing, but subject to zero-bias drift.
- Magnetometer: x-, y-, and z-axis magnetic field sensing, but sensitive to magnetic interference.
By combining all of these technologies, sensor fusion takes the simultaneous input from the multiple sensors, processes it, and creates an output that is greater than the sum of its parts. That is, by using special algorithms and filtering techniques, sensor fusion compensates for the deficiencies of each individual sensor.
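One of the simplest filtering techniques of this kind is a complementary filter, which blends the gyroscope's smooth but drifting angle estimate with the accelerometer's noisy but drift-free tilt reading. The sketch below is illustrative only; the function name, axis conventions, and the blend weight `alpha` are assumptions, not the algorithm used in any particular product.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one pitch estimate (degrees).

    The gyro integral is smooth but drifts over time; the accelerometer tilt
    angle has no drift but is noisy under vibration. Blending them with
    weight `alpha` masks each sensor's weakness with the other's strength.
    """
    # Short-term estimate: integrate the gyro's angular rate (deg/s) over dt.
    gyro_pitch = pitch_prev + gyro_rate * dt
    # Long-term reference: absolute tilt from the gravity vector the
    # accelerometer measures.
    accel_pitch = math.degrees(math.atan2(accel_y, accel_z))
    # Trust the gyro on short timescales, the accelerometer on long ones.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Called once per sample period, the filter tracks fast rotations via the gyro term while the small accelerometer contribution slowly pulls the estimate back to the true tilt, canceling gyro drift.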
The most important aspect of sensor fusion is providing context to the situation being analyzed. It would be impossible to provide situation-appropriate information without knowledge of the context, and ideally this context is provided in real time, as the action takes place, without the need for human intervention.
Sensor fusion often refers to the combination of a 3D accelerometer, a 3D gyroscope, and a 3D magnetometer. This configuration is called a 9-SFA (nine-sensor-fusion-axis) solution (see Fig. 1), which affords the user nine degrees of freedom (9 DoF). In 2012, Freescale introduced a 12-axis Xtrinsic sensor platform for Windows 8 that offers a 12-DoF sensor fusion solution. By combining a 3D accelerometer, a 3D gyroscope, and a 3D magnetometer (9 DoF) with a barometric sensor, a thermal sensor, and an ambient light sensor, this hardware/software solution fuses the accelerometer, magnetometer, and gyroscope data on a 32-bit microcontroller to provide ease of integration for streamlined development.
Fig. 1; In this nine-degree-of-freedom (9-DoF) system block diagram, outputs of a 3D accelerometer, a 3D gyroscope, and a 3D magnetometer are combined to produce more information for the user. This approach was codified by Microsoft for its Windows 8 software system.
Today a variety of body-worn sensors are being used for ambulatory monitoring of gait (gait is the pattern of movement of the limbs during locomotion over a solid substrate/ground). Sensors such as ECG, EMG, accelerometers, magnetometers, gyroscopes, thermometers, light sensors, and vibration sensors are all being used in a variety of wearable devices in clinical settings for patient gait monitoring and rehabilitation. New classes of wearable devices will bring these capabilities and more to everyday use. Advances in the miniaturization of MEMS devices, small yet powerful and energy-efficient MCUs, and low-power connectivity technologies have enabled a new category of wearable consumer medical devices that focus on monitoring personal health trends rather than waiting for chronic conditions to set in. Wearable devices, along with gateways that tie together other types of sensor information, will enable regular biometric readouts to health care providers and, along with personal health records (PHRs), provide cloud-based remote medical, or telehealth, services (see Fig. 2).
Fig. 2; In the telehealth example shown here, wearable devices that use a Bluetooth protocol communicate with a mobile handset that connects to a healthcare service provider, who also has access to personal medical records stored on a secure remote server.
The potential for wearables has also given rise to a movement called self-quantification, in which individuals incorporate technology into the acquisition of data about aspects of their daily lives. The inputs to the person being quantified include mental and physical activities; inputs to the body such as food and air quality; and other factors such as blood sugar and oxygen levels, blood pressure, heart rate, body temperature, hunger, and mood-related information, recorded before, during, and after various stimuli. In effect, this is an automatic self-monitor that combines wearable sensors and wearable computing to fuse the variety of information and apply context, creating an assessment that is much greater than the individual readouts.
In telehealth situations, the continuous monitoring of patient biometrics and fusing of the various factors will lead to faster and better care of chronic diseases, versus a doctor visit every three to six months.
Other sensor fusion examples
A medical doctor and electrical engineer, Dr. José Fernández Villaseñor combines his work as a Freescale medical product marketer and a hospital physician in his study of emotion analysis using sensors. Research shows that heart rate increases due to physical activities have a different pattern and slope than increases due to adrenalin from excitement. Hence, one can algorithmically analyze sensor data to electronically detect the types of emotion a person is displaying. This can lead to new modes of health analysis.
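As a toy illustration of the idea that exertion and excitement produce heart-rate rises with different slopes, the sketch below fits a least-squares slope to a window of heart-rate samples and labels it. The threshold value and labels are made-up placeholders; a real system would learn the characteristic pattern and slope from labeled physiological data.

```python
def classify_hr_rise(hr_samples, dt, slope_threshold=2.0):
    """Label a heart-rate rise as 'excitement' or 'exertion' from its slope.

    hr_samples: heart-rate readings in bpm, taken dt seconds apart.
    slope_threshold: bpm/s above which the rise is treated as an
    adrenaline spike rather than a gradual workout ramp (illustrative value).
    """
    n = len(hr_samples)
    times = [i * dt for i in range(n)]
    mean_t = sum(times) / n
    mean_h = sum(hr_samples) / n
    # Least-squares slope of heart rate versus time (bpm per second).
    num = sum((t - mean_t) * (h - mean_h) for t, h in zip(times, hr_samples))
    den = sum((t - mean_t) ** 2 for t in times)
    slope = num / den
    # A sudden adrenaline response rises faster than a workout ramp-up.
    return "excitement" if slope > slope_threshold else "exertion"
```

In practice the slope would be just one feature among several (heart-rate variability, motion level from the accelerometer) fused to decide which explanation fits the data.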
Consider as an example a gaming platform that can detect emotions electronically (Fig. 3), by monitoring and acquiring data from physiological variables and states such as:
- Muscle relaxation — via a pressure sensor.
- Muscle contraction — via a pressure sensor.
- Attitude — via an accelerometer (monitoring a person's state of relaxation as indicated by jerky movements versus steady hands).
- Heart rate variability — via a two-electrode ECG on a chip.
- Sweat — via a capacitive sensor.
Fig. 3; With sensor fusion, the use of multiple sensors to monitor a person's physical state during a video game can be used to diagnose the player's emotional state and provide feedback to improve gamesmanship as well as his or her health.
Using the sensor data collected, a microcontroller in the game platform could, for example, detect emotions and give the gamer feedback during game situations to make the game more exciting. How about making turns faster and more difficult to maneuver in a driving game until the gamer shows a more relaxed state (a less jerky reading from the accelerometer)? Hence, the calm driver with better command over his or her emotions will have a better score (similar to real life).
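One way to turn "jerky versus steady hands" into a number the game could act on is to compute the RMS of the jerk (the time derivative of acceleration) over a window of accelerometer samples. The sketch below is a hypothetical game hook, not an actual platform API; the function names and the calm threshold are assumptions.

```python
def jerkiness(accel_samples, dt):
    """RMS jerk of a window of accelerometer samples (one axis).

    Jerk is the rate of change of acceleration; steady hands produce
    slowly varying acceleration and therefore a low score.
    """
    jerks = [(b - a) / dt for a, b in zip(accel_samples, accel_samples[1:])]
    return (sum(j * j for j in jerks) / len(jerks)) ** 0.5

def adjust_difficulty(jerk_score, calm_threshold=0.5):
    """Hypothetical feedback rule: keep the turns hard until the player's
    accelerometer trace shows a relaxed, steady hand."""
    return "harder_turns" if jerk_score > calm_threshold else "normal_turns"
```

Run per axis (or on the acceleration magnitude) over a sliding window, such a metric lets the game reward the calm driver exactly as the paragraph above describes.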
Emotion sensing can also be applied to the information gathered for telehealth, self-quantification and other fields, providing better context. Sensor fusion raises the bar on the application of sensors in medical electronics and the benefits become limitless.