How Apple Sees the Near Future

Siri is set to play a much larger role in the company’s products.

Tim Cook, CEO of Apple, holds an iPad Pro after his keynote address at WWDC. (Stephen Lam / Reuters)

Without once saying the words “artificial intelligence,” a stream of Apple executives described a vision of the near future in which Siri, the company’s AI avatar, stitches together the company’s many hardware products.

And they introduced a new—and widely anticipated—entry into their lineup: a $349 cylindrical voice-controlled speaker they call HomePod.

After a strangely dystopian video in which Apple’s apps go away and the world plunges into post-apocalyptic violence, Apple CEO Tim Cook led off the company’s keynote at its big gathering for coders, the Worldwide Developers Conference, in San Jose.

The WWDC keynote tends to be a place where Apple showcases all the little incremental “refinements” they are making to their software and hardware. This year, however, there was a thread that ran through many presentations: Siri.

Through the demonstrations and talks, Apple’s vision for Siri became clearer: It is an all-purpose stand-in for predictive, helpful intelligence across all Apple devices. “Siri isn’t just a voice assistant,” said Craig Federighi, Apple’s senior VP of software engineering. “With Siri intelligence, it understands context. It understands your interests. It understands how you use your device. It understands what you want next.”

For example, Federighi said, imagine you’re planning a trip to Iceland. Siri might suggest stories about Iceland within the news app or even suggest the spelling for a difficult Icelandic place name. (Perhaps she’ll even suggest some Björk for your HomePod.)

Even the Apple Watch has a new (and decidedly Google Now-like) face that guesses what information you might want to see on that tiny screen at any given time.

Siri suffuses all the Apple products now. It’s less a voice-UI gimmick than an organizational structure for how Apple thinks about proactive and reactive user assistance. Or, to put it slightly less generously, “Siri is turning into Watson, a generic brand for anything using simple machine learning,” tweeted Ben Cunningham, a former Facebook engineer.

The Apple presenters said “machine learning” probably two dozen times as they described their plans for iOS, the software that runs iPhones and iPads, watchOS, and the next version of macOS. They announced a new set of machine-learning tools for developers, which will give app makers access to Apple’s computer-vision and natural-language-processing capabilities.

That kind of easy AI access was a theme of Google’s developer conference, too. But unlike Google and Amazon, Apple emphasized the privacy features of their devices. For example, Amazon’s Echo speakers transmit some data to Amazon as they wait to hear the word “Alexa” and spring into action. But Apple’s HomePod will do that processing locally inside the speaker. “Until you say [‘Hey Siri’], nothing is being sent to Apple,” said Phil Schiller, Apple’s senior VP of worldwide marketing. And then, what is sent to the company’s servers is an “anonymous Siri ID.”

Amazon emailed me to clarify that they don’t send data while the Echo is idling, waiting to be called upon. “When you use the wake word,” a company spokesperson explained, “the audio stream includes a fraction of a second of audio before the wake word, and closes once your question or request has been processed.”
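The pattern the spokesperson describes—nothing leaves the device while idle, then a brief pre-roll plus everything after the trigger—is a standard wake-word gate built on a small local ring buffer. Here is a minimal illustrative sketch of the idea, not Amazon’s or Apple’s actual implementation; the `detect` function and string “frames” stand in for a real acoustic model running on audio samples:

```python
from collections import deque

def wake_word_gate(frames, detect, pre_roll=3):
    """Yield only the frames that would leave the device: nothing
    while idle, then a short pre-roll buffer plus everything after
    the wake word is detected. (Hypothetical sketch; real systems
    run a trained detector over audio, not strings.)"""
    buffer = deque(maxlen=pre_roll)  # last few frames, kept locally
    triggered = False
    for frame in frames:
        if triggered:
            yield frame              # stream is open: send everything
        elif detect(frame):
            triggered = True
            yield from buffer        # the "fraction of a second" before
            yield frame              # the wake word itself
        else:
            buffer.append(frame)     # idle: frames stay on the device
```

Feeding it `["a", "b", "c", "d", "alexa", "what", "time"]` with a detector matching `"alexa"` and a two-frame pre-roll would emit only `["c", "d", "alexa", "what", "time"]`—the idle chatter before the buffer window never leaves the device.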

It is true that Apple’s business model is far less dependent on amassing data about individual people than Google’s, Facebook’s, or Amazon’s. They sell stuff to people.

Or as analyst Horace Dediu summed up the Apple pitch: “Siri knows you. Apple doesn’t.”

Taken together with Google’s, Microsoft’s, Amazon’s, and Facebook’s pushes into this space, it would seem that we’ll soon have a wide variety of systems that build and rebuild their models of your desires every moment, hoping they can provide just the right suggestion. It’ll be nudges all the way down.