
Are smartphones smart enough to read your emotions?

Will smartphones eventually be able to process our facial expressions and perform emotion-aware computing?

Image: smartphone camera (via Navarrowwright)

Smartphones have a front-facing camera that is constantly staring at our faces. But what if it started reading our emotions? Every time we glance down at our phone to check updates or make a call, we are also greeting that front-facing camera, which could potentially analyze our faces with advanced technology. Yes, phones can respond to our vocal commands and understand what we're saying; but will smartphones eventually be able to process our facial expressions and perform emotion-aware computing?

Several start-ups are already trying to answer this question, attempting to capitalize on users' emotions and commercialize the idea. The brand Affdex uses customized technology to read people's expressions and establish a connection between the user, advertising, and brands. Companies offering similar technology include Emotient, Realeyes, and nViso.

These programs would use the smartphone's built-in front-facing camera to detect and classify your emotions, then tailor the apps, advertisements, and promotions on the phone to match your mood. For example, when the user appears happy, the smartphone could show the list of items that person purchased the last time he or she was feeling cheerful. If the user appears angry while playing a game, the phone could lower the level's intensity. And when the user's facial expression denotes sadness, a funny advert would pop up on the screen.
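
None of these companies has published its code, but the behavior described above amounts to a simple mapping from a detected emotion label to an app action. A minimal sketch in Python, with illustrative emotion labels and action names that are not taken from Affdex, Emotient, Realeyes, or nViso, might look like this:

```python
# Hypothetical sketch: map a detected emotion label to an app action.
# Labels and actions are invented for illustration only.

def choose_action(emotion: str, context: str) -> str:
    """Pick a response based on the detected emotion and what the user is doing."""
    if emotion == "happy":
        return "show_previous_purchases"   # surface items bought while in a good mood
    if emotion == "angry" and context == "playing_game":
        return "lower_game_difficulty"     # ease off the level's intensity
    if emotion == "sad":
        return "show_funny_advert"         # try to lift the user's mood
    return "do_nothing"

# Example usage
print(choose_action("angry", "playing_game"))  # -> lower_game_difficulty
```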

These apps will surely test whether we can indeed read a person by the expression on his or her face. As psychologist Paul Ekman explained in the 20th century, people frequently judge the emotional state of another by interpreting facial expressions. At a quick glance, we see fright, disgust, or sheer happiness.

But expressions are extremely hard to interpret. A smile is a universal "hello," and doesn't necessarily mean that the person wearing it is happy; that person could be pleased, embarrassed, or even frustrated. And even when we are alone with our phones, will we be smiling at all? If we are happy but not smiling, how will the smartphone classify us as happy?

These apps would work better if there were some way to analyze the whole person: body language, posture, voice, and contextual cues. It doesn't seem plausible that camera-based face analysis alone can work accurately. Our smartphones are dumb; they do not currently have enough intelligence to draw conclusions from our expressions. An algorithm integrated into the smartphone would, in effect, be more effective than relying on the built-in camera alone.

Apple has a patent application under consideration at the U.S. Patent Office that outlines an algorithm incorporating how people use their smartphones. Users' activity patterns, such as switching between apps, browsing history, and data usage, would be analyzed. Vital signs like heart rate and blood pressure would also be examined in tandem with facial recognition to interpret the user's emotional state.
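
The filing is described only at a high level, but combining usage patterns, vital signs, and a facial-expression reading could be sketched as a simple weighted estimate. The weights, thresholds, and labels below are invented for illustration and are not from Apple's application:

```python
# Hypothetical sketch of fusing usage patterns, vital signs, and a facial
# expression score into a single emotional-state guess. All numbers and
# labels are made up; they are not from Apple's patent filing.

from dataclasses import dataclass

@dataclass
class Signals:
    app_switches_per_min: float  # rapid switching might suggest agitation
    heart_rate_bpm: float        # elevated heart rate adds to the score
    face_negativity: float       # 0.0 (relaxed face) .. 1.0 (distressed face)

def estimate_state(s: Signals) -> str:
    score = 0.0
    if s.app_switches_per_min > 5:
        score += 0.3
    if s.heart_rate_bpm > 100:
        score += 0.4
    score += 0.3 * s.face_negativity
    return "stressed" if score > 0.5 else "calm"

# Example usage
print(estimate_state(Signals(app_switches_per_min=8,
                             heart_rate_bpm=110,
                             face_negativity=0.6)))  # -> stressed
```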

But do we really want this technology? Is it really necessary to have smartphones interpret our every move and project their "smart" interpretation of how we're feeling?

Story via Phys.org
