Making a Smart(er) Phone—One That's Able to Read Your Emotions


What do neuroscientists and salesmen have in common? Virtually nothing would make them happier than the ability to read your mind: the former in order to understand how the brain works, the latter to leverage that information to sell you things (apologies if you were expecting a punch line to the opening question). People are generally very good at reading emotions in others based upon factors such as tone of voice, facial expression, and body language. But what about machines? Can they, through artificial intelligence and principles of human-computer interaction, be taught to read emotions too?

This is precisely what a team of researchers at the Samsung Advanced Institute of Technology in South Korea is working toward by developing smartphones capable of inferring user emotion. As reported in MIT's Technology Review:

Rather than relying on specialized sensors or cameras, the phone infers a user's emotional state based on how he's using the phone.

For example, it monitors certain inputs, such as the speed at which a user types, how often the "backspace" or "special symbol" buttons are pressed, and how much the device shakes. These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology's Intelligence Group, in South Korea. Lee led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one's mental state, which the software's machine-learning algorithms can detect with an accuracy of 67.5 percent.
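The report doesn't say which learning algorithm the system uses, but the basic idea is easy to sketch. Below is a minimal illustration of that kind of behavior-to-emotion classifier (our own sketch, not Samsung's code), written in Python with scikit-learn; the feature set and all of the training numbers are invented for illustration, and a decision tree stands in for whatever model the researchers actually trained.

```python
# A minimal sketch of the idea, not Samsung's actual system: a classifier
# maps behavioral signals captured while typing to one of six emotions.
# Feature choices and all numbers below are invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per typing session:
# [typing speed (chars/sec), backspace presses/min,
#  special-symbol presses/min, mean device shake (g)]
X_train = [
    [4.5, 1.0, 0.5, 0.10],
    [1.2, 6.0, 0.2, 0.80],
    [3.0, 2.0, 2.5, 0.30],
    [0.8, 1.5, 0.1, 0.20],
    [2.5, 4.5, 1.8, 0.90],
    [1.0, 0.5, 0.3, 0.60],
]
y_train = ["happy", "angry", "surprised", "sad", "disgusted", "fearful"]

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Classify a new session: fast typing, few corrections, steady hands.
print(model.predict([[4.0, 1.2, 0.4, 0.15]]))  # e.g. ['happy']
```

In practice the labels would presumably come from users reporting their own mood while the phone logs these signals in the background; the 67.5 percent figure is how often such a model's guesses match those reports.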

Once a phone infers an emotional state, it can then change how it interacts with the user:

The system could trigger different ringtones on a phone to convey the caller's emotional state or cheer up someone who's feeling low. "The smartphone might show a funny cartoon to make the user feel better," he says.

As another example, one could imagine that our phones and Siri-esque assistants might peek at our calendars and tell when we are busy, and therefore more likely to be stressed. Similarly, contextual cues could be used in conjunction with these behavioral signals to predict emotional state, e.g., inferring from geolocation whether the user is stuck in a traffic jam or experiencing bad weather; a sketch of that idea follows below. We at medGadget are excited to see how this innovative application of technology develops. Maybe in the not-too-distant future our phones will also double as our psychiatrists.
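One simple way such contextual cues could be folded in is as extra entries in the feature vector fed to the classifier above. The sketch below is hypothetical; the feature names (calendar_busy, bad_weather, in_traffic) are our own invention, not part of the published work.

```python
# Hypothetical extension of the earlier sketch: append contextual cues
# (calendar busyness, weather, traffic) to the behavioral features.
# These feature names are invented, not part of Samsung's system.

def feature_vector(typing_speed, backspace_rate, symbol_rate, shake,
                   calendar_busy, bad_weather, in_traffic):
    """Combine behavioral and contextual signals into one feature vector."""
    return [typing_speed, backspace_rate, symbol_rate, shake,
            float(calendar_busy), float(bad_weather), float(in_traffic)]

# A stressed commute: slow, error-prone typing during a traffic jam.
x = feature_vector(1.1, 5.5, 0.2, 0.85,
                   calendar_busy=True, bad_weather=True, in_traffic=True)
print(x)
```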


This post also appears on medGadget, an Atlantic partner site.

medGadget is written by a group of MDs and biomedical engineers.
