Courtesy of Daniel Wilson

When he was in grad school, the roboticist Daniel Wilson installed 150 binary sensors in his house. They ranged from infrared motion sensors—the kind you find in taps and towel dispensers in public washrooms—to audio sensors, laser break-beam sensors, and contact switches hooked up to overhead lights, furniture, and appliances.

Over the next two years, Wilson collected data on every aspect of his daily routine, from how long he spent in the shower to how many times a day he opened the cutlery drawer. Pressure mats fixed to the bottom of his couch and chairs recorded how long he spent sitting down; a small, wireless microphone allowed him to turn lights on and off using just his voice. He even built a wireless toothbrush to record the time and length of his oral hygiene habits. “I was like a mad scientist,” Wilson said.

Wilson was testing a “smart environment”—a machine learning algorithm that collects information from sensors in order to simultaneously track and recognize the activity of the occupants within its range. In other words, the system’s goal was to track a person’s movements as they went about their daily routine, collecting data and matching it against an existing database to check for abnormalities. Wilson was thinking expressly of the elderly and disabled when he designed the system for his Ph.D. thesis at Carnegie Mellon’s Robotics Institute. Was it possible, he wondered, to design something that would allow a relative or caregiver to monitor someone around the clock from an offsite location?
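Wilson’s actual models were far more sophisticated, but the core idea above, learning a routine from sensor data and then flagging deviations from it, can be sketched in a few lines. Everything here (the activity, the readings, the threshold) is a made-up illustration, not Wilson’s system:

```python
# Minimal sketch of the kind of anomaly check a sensor-based "smart
# environment" might run. The data and the 3-sigma threshold are
# hypothetical; Wilson's thesis used far richer statistical models.
from statistics import mean, stdev

def is_abnormal(history, today, k=3.0):
    """Flag today's reading if it deviates more than k standard
    deviations from the historical baseline for that activity."""
    if len(history) < 2:
        return False  # not enough data yet to establish a routine
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu  # routine is perfectly regular
    return abs(today - mu) > k * sigma

# Daily minutes spent in the shower over the past week (invented data)
shower_minutes = [9, 11, 10, 8, 10, 9, 11]

print(is_abnormal(shower_minutes, 10))  # a typical day: False
print(is_abnormal(shower_minutes, 45))  # far outside the routine: True
```

A caregiver-facing system would alert on the second case: a 45-minute shower, against a baseline of roughly ten minutes, might mean a fall.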

Wilson had hit upon the idea after talking to his mother, a registered nurse and case manager who drove around to check up on patients who lived in isolated or remote areas. Since most people would draw the line at having cameras installed in their living room or bathroom, a discreet sensor-based system could allow nurses like Wilson’s mother to keep track of their patients’ movements without invading their privacy. This would also allow people to live in their own homes longer, without having to be shipped off to a nursing home. “It would be like watching a ghost move around in an environment,” Wilson said. “You wouldn’t actually be able to see the person, but you could see a drawer opening and closing, a light being turned on, a chair being sat in.”


Wilson completed his Ph.D. in 2005. The same year, he wrote a nonfiction humor book titled How to Survive a Robot Uprising. The book was so successful that Paramount Pictures optioned the rights. A screenplay was written, with Mike Myers attached to play the leading role. Then, as these things sometimes go, the project was shelved indefinitely. But Wilson kept writing. Where’s My Jetpack? A Guide to the Amazing Science Fiction Future that Never Arrived was published in 2007, followed by How to Build a Robot Army in 2008. Wilson’s first full-length fiction book, Robopocalypse, was published in 2011, making The New York Times bestseller list and attracting the attention of Steven Spielberg, who committed to direct a film adaptation currently in development at DreamWorks. Since then, Wilson has written over a dozen short stories, comic books, and graphic novels, most of them drawing on his knowledge of robotics and artificial intelligence. His most recent book, Robogenesis, was published last year by Doubleday. Although a successful writing career is nothing to scoff at, Wilson jokes that he should have stuck to smart environments—companies like Nest Labs are now using sensor-driven, self-learning systems to design thermostats and smoke detectors. Last January, Google acquired Nest Labs for $3.2 billion.

Last year, Wilson put writing aside momentarily to turn his attention back to robotics. He enlisted the help of a mobile game and app design studio in Portland, where he lives, to create a prototype for an iOS app that made use of Apple’s speech recognition technology; the prototype was so good that Wilson decided to turn it into an adventure game called Mayday! Deep Space. The game, launched on the App Store last week, is less ambitious than his previous work, but arguably more innovative: Players communicate with the game’s protagonist, a lone survivor on a ship stranded in space after a virus outbreak onboard, using a series of voice commands. In that sense, it’s nothing like a traditional gameplay experience, save for a single button, which players hold down while speaking to the survivor and release when they’re done. The voice commands are simple—walk forward, turn left, turn right, run, and so on—but the survivor, voiced by Supernatural’s Osric Chau, understands them, and answers back. The goal is to verbally guide the survivor through the ship, helping him to avoid obstacles and infected crew members.

Wilson was partly inspired by the scene in the film Aliens in which Ripley and Gorman monitor the marines’ movements from inside an armored personnel carrier using a radar screen and radio communication. When the marines come under attack, Ripley and Gorman can only hear what’s happening over the radio—the rest is left to their imagination. Mayday provides just enough stimuli to allow players’ imaginations to fill in the blanks. As we know from watching movie adaptations of books, what we conjure up in our mind can often outshine the efforts of a good director and $100 million in special effects. The fact that Mayday requires players to be alone, in a quiet place, where they can talk to their phone and hear it talk back, further prompts this imaginative thinking. I found myself getting annoyed with the survivor when he didn’t run fast enough, and, even though I knew he couldn’t recognize the command “Run faster, you jerk!,” I still said it, more than once. I also found myself verbally responding to what he was saying at various points in the game, even though I knew he couldn’t hear me. This dynamic was by design, Wilson told me.

“The very act of speaking out loud to someone, even if you know that person is not real, creates an instant connection,” Wilson said. “Somewhere deep in the recesses of your brain, you believe this person is real. Whether you want to or not, speaking out loud causes you to build a relationship with whoever, or whatever, you’re talking to.”

Robots and artificial intelligence systems have a distinct advantage over other technology because they can leverage our innate interest in other social beings. We’re drawn to things and people we can relate to, which is why we sometimes see faces in clouds, or in tortilla chips. Even though we know an artificial intelligence like Apple’s Siri is not human, the fact that she can stand in for human contact and answer our questions, sometimes in a surprising and humorous way, is satisfying and intriguing, regardless of the knowledge that it was a human who programmed her to say those things.


Autonomy, however rudimentary, is another reason we find artificial intelligence so appealing. In Mayday, players can guide the survivor and tell him where to go, but he reserves the right to refuse to follow a direction if he thinks he’s in danger. Think of it like riding a horse: You can steer the horse in the right direction, but you can’t make it run off a cliff. At some point, the animal’s self-preservation instinct will kick in. Mayday goes a step further by asking players to make moral decisions on behalf of the survivor, sometimes without his knowledge. If he is autonomous, then the player is too.

This is a new way to think about games. Most contemporary games require players to put themselves in the shoes of the protagonist, to aid in the illusion that the fictional world the protagonist inhabits is real. But that doesn’t always work. “In video games it’s not uncommon to see a character running in place, forehead pressed against a wall,” Wilson told me. “It goes where you tell it, like a puppet, without any agency. It’s enough to take you out of the experience.”

Like any animal, human or not, the Mayday survivor wants to live. If you tell him to do something that goes against that instinct, he’ll kindly tell you to shove off. (“I’m not doing that.”) Eventually, Wilson wants to update the game with more advanced voice recognition software that will allow the player and the protagonist to communicate with emotional responses.
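The self-preservation behavior described above amounts to a simple gate between the recognized command and the character’s action. The sketch below is a hypothetical illustration, not Mayday’s actual code; the command names, the hazard set, and the responses are all invented:

```python
# Hypothetical sketch of a self-preservation gate: the character carries
# out the player's command only if it doesn't lead into a known hazard.
def respond(command, hazards):
    """Return the survivor's reply to a recognized voice command."""
    if command in hazards:
        # The character overrides the player, preserving the illusion
        # of agency (and his own life).
        return "I'm not doing that."
    return f"Okay, {command}."

# An infected crew member is dead ahead (invented scenario)
hazards = {"walk forward"}

print(respond("turn left", hazards))     # -> Okay, turn left.
print(respond("walk forward", hazards))  # -> I'm not doing that.
```

The point of the gate is exactly the horse analogy: the player steers, but the character’s own check runs last.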

“What will it mean to inspire an AI character who is afraid, or to console a character who is grieving? Imagine the stories we'll tell when our characters become real enough to talk to.”
