The Wearable Device That Could Unlock a New Human Sense

A neuroscientist and his team have created a vest to help deaf people hear through a series of vibrations.

Circuitry translates signals from the app into vibrations on the vest. (Jeff Fitlow/Rice University)

In March, the neuroscientist David Eagleman stood on stage to give a TED talk on sensory substitution, the idea of replacing the duties of one sense by using another. He spoke of how little of the world humans are able to sense, despite the sophistication of the tools available—the eyes, the ears, the nose, and so on—because each sense can only pick up so much. Visible light, for example, makes up just a sliver of the electromagnetic spectrum, and the inner ear can't pick up faint sounds made a mile away.

Humans are limited by their sensory world, so when they lose a sense, they can try to replace it (like cochlear implants for the deaf) or redirect it to another sense (like braille for the blind). Eagleman's work focuses on the latter, but he also wanted to use technology to add senses, not just replace or redirect them. Which is why he created the VEST:

The VEST, or the Versatile Extra-Sensory Transducer, is a wearable tool that allows the deaf to, as Eagleman puts it, "feel" speech. An app on a smartphone or tablet picks up sounds through the device's microphone and sends them via Bluetooth to the vest. The vest then "translates" those sounds into a pattern of vibrations that reflects the frequencies picked up by the mic, using a network of transducers—devices that convert the signals into vibrations. So if you spoke to the person wearing the vest, that person would "feel" what you're saying through vibrations on their back, instead of hearing it through their ears.

But Eagleman is quick to point out that the vest isn't just translating the sounds into a code—the patterns felt aren't a "language" to be interpreted like braille. In fact, the device doesn't use a specific language; it responds to all ambient noises and sounds.

"The pattern of vibrations that you're feeling [while wearing the VEST] represent the frequencies that are present in the sound," he said. "What that means is what you're feeling is not code for a letter or a word—it's not like Morse code—but you're actually feeling a representation of the sound."
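Eagleman's description suggests a simple pipeline: slice incoming audio into short frames, measure how much energy falls into each frequency band, and drive one motor per band. The sketch below is a hypothetical illustration of that idea, not Novich and Eagleman's actual algorithm; the frame size, sample rate, and 32-motor layout are all assumptions for the example.

```python
import math

def frame_to_motor_intensities(samples, n_motors=32):
    """Map one audio frame to per-motor vibration intensities.

    Hypothetical sketch: take the magnitude spectrum of the frame
    (a naive O(n^2) DFT, for clarity), pool adjacent frequency bins
    into one band per motor, and normalize so the loudest band drives
    its motor at full strength. The pattern the wearer feels is thus
    a direct representation of the frequencies present in the sound,
    not a symbolic code like Morse.
    """
    n = len(samples)
    # Magnitude of each positive-frequency DFT bin.
    mags = []
    for k in range(n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    # Pool adjacent bins into one band per motor.
    band = max(1, len(mags) // n_motors)
    levels = [sum(mags[i * band:(i + 1) * band]) for i in range(n_motors)]
    peak = max(levels) or 1.0
    return [lvl / peak for lvl in levels]  # 0.0-1.0 drive level per motor

# A pure 440 Hz tone (sampled at 8 kHz) concentrates its energy in a
# single band, so only one spot on the wearer's back would buzz hard.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(256)]
intensities = frame_to_motor_intensities(tone)
```

A real device would use a fast FFT, run continuously on the phone, and stream the 32 intensity values to the vest's motor controllers over Bluetooth; the principle, though, is the same as this sketch.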

It's an approach that hasn't been tried in wearable technology yet. The Apple Watch may assign different patterns of vibration to mean different things—one pattern for an incoming text, for example, and another for an incoming tweet—but that's still assigning meaning to a feeling. The VEST, Eagleman emphasized, doesn't follow that model.

In order to create something that can transform all sounds into vibrations, Eagleman needed plenty of hardware. To help shrink the parts built into the vest, Eagleman enlisted six electrical and computer engineering students from Rice University to work with his lab. Eagleman, the team, and Scott Novich, a doctoral student working in Eagleman's lab, created prototypes, with the latest to be unveiled this week.

The task hasn't been easy. "The idea itself was very simple: It's taking sound with the phone, doing some calculations to it, and sending it over to the vest which sends vibrations to the body," Eric Kang, one of the Rice student team members, told me. "But when we tried to build it, all we encountered were issues after issues."

Space issues, mostly. The VEST had to remain as compact and light as possible while comfortably carrying the 32 to 48 transducers and motors required to transmit the vibrations, along with the circuitry that receives the signals from the app.

So far, it all works, Eagleman said. The team tested a prototype on a 37-year-old deaf man who, after five days of wearing the VEST, understood the words said to him out loud by feeling the vibrations because, as Eagleman put it, "his brain is starting to unlock what the data mean."

That "unlocking" phenomenon, like adding a new sense, is hard to explain. How does a series of vibrations that supposedly reflects sound eventually take on meaning when there's no language assigned to it? How does the brain on the first day have no idea what a couple of vibrations on, say, the lower back mean, but by the fifth day, know that they form a specific word?

"My view is that the brain is a general-purpose computational device," Eagleman told me. "You could take any kind of data stream and the brain will figure it out. I consider it the biggest miracle no one's heard of."

And that miracle could have more applications than simply allowing people to "feel" sound. John Yan, another Rice student team member, says gaming could be a lucrative field for the haptic devices. "Controllers rumble," he points out. "Virtual reality could be the most immediately commercializable field." Eagleman, meanwhile, thinks the VEST can unlock robotics by helping humans feel what robots feel. Pilots controlling a quadcopter or drone could "sense" the robot's movements from the ground. Astronauts could "feel" the health of the International Space Station through a series of vibrations that report on its status. People could "see" 360 degrees, not by using their eyes, but by using Bluetooth or Wi-Fi to pick up on some other form of feedback that humans can't sense yet.

Sure, all of those potential applications have a tinge of science fiction to them, but Eagleman himself has donned the vest and "felt" what it can do. At the end of his TED talk, he heard the audience's applause. But with the VEST on, he felt the tingling of the vibrations moving across his back, too.