Apple’s Next Security Update May Not Be User-Friendly

If the company encrypts iCloud backups, users would likely lose access to their valuable photos and contacts if they forget their password.

New York police officers stand outside the Apple Store on Fifth Avenue while monitoring a demonstration last week. (Julie Jacobson / AP)

As Apple and the FBI face off in an increasingly public legal battle, the company is looking for more ways to keep its customers’ data safe. Apple executives have said the company is developing new security measures, which could include a change that would render users’ information inaccessible even to its employees. But that change could make Apple products harder to use, at a time when the company is already being criticized for the growing complexity of its software.

Until now, the changes that Apple has incorporated into successive versions of its hardware and software have largely stayed out of the user’s way. An update to the iOS operating system in 2014 encrypted the entire contents of every compatible iPhone and iPad, while still allowing users to unlock their phones with just a four-digit passcode. And most recent Apple devices are equipped with fingerprint readers that free users from having to enter the passcode every time they unlock their devices.

But now, Apple is said to be considering a change that would significantly boost the security of user data, at a potentially high cost to usability. According to reports in The New York Times and the Financial Times, the change would affect Apple’s own servers rather than users’ devices.

Currently, new iPhones and iPads are equipped with “full-disk” encryption, a technology that scrambles the entirety of a device’s contents. The only way to decode the data is with a combination of two keys: a unique hardware key bundled with every device, and a passcode chosen by a user.
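The two-key scheme can be illustrated with a toy sketch. This is not Apple’s actual implementation (which entangles keys inside dedicated hardware); the hardware-key values and the function name below are invented, and PBKDF2 stands in here for whatever key-derivation Apple really uses. The point is that the decryption key depends on both inputs, so neither the passcode nor the hardware key alone is enough:

```python
import hashlib

def derive_disk_key(hardware_key: bytes, passcode: str) -> bytes:
    # Hypothetical sketch: PBKDF2 "entangles" the user's passcode with a
    # device-unique hardware key. The high iteration count also slows
    # down brute-force passcode guessing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_key, 100_000)

device_key = b"\x13" * 32            # invented stand-in for a per-device key
key_right = derive_disk_key(device_key, "1234")
key_wrong = derive_disk_key(device_key, "1235")
other_dev = derive_disk_key(b"\x14" * 32, "1234")

print(key_right != key_wrong)   # a wrong passcode yields a useless key
print(key_right != other_dev)   # the right passcode on other hardware fails too
```

Because the hardware key never leaves the device, even guessing every four-digit passcode has to happen on the phone itself, which is part of what has stymied investigators.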

This powerful method of encryption has stymied even the FBI, which is now asking Apple for help as it tries to break into an iPhone 5c that was in the possession of Syed Farook, one of the shooters who killed 14 people and injured 22 in the San Bernardino attack.

But there’s a catch: Although all the data stored on Farook’s phone is locked, inaccessible even to Apple, the company can easily retrieve any information the phone sent to its servers in the form of routine iCloud backups.

The government said Farook’s phone stopped sending data to iCloud several weeks before the attack, suggesting that he manually turned off the setting. If he hadn’t, the FBI would not need to get into the phone at all, or even know where it is: Law enforcement can use subpoenas to compel Apple to turn over data from iCloud backups on its servers.

According to Apple’s latest transparency report, the company turned over content in 295 different iCloud accounts in response to government requests in the first half of 2015. Included in iCloud backups are emails, iMessages, photos, contacts, calendar events, notes, reminders, Safari history, and passwords.

Apple’s ability to dip into iCloud backups is, in essence, a security hole. The company’s lawyers have argued that writing code that would help the FBI amounts to opening up a “back door” that criminals and hackers could exploit in the future. If criminals can steal that code, Apple says, they’d be able to break into nearly any iPhone or iPad. But the same vulnerability holds for iCloud data, which could be accessed by an intruder who gains access to Apple’s systems.

The change Apple seems to be considering would put iCloud data beyond the reach even of its own employees. If the police asked Apple for someone’s iCloud backups, the company could hand over only encrypted data, which is virtually worthless without the keys to decrypt it. Only the user who made the backup could decrypt it.
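What such an arrangement would mean in practice can be sketched with a toy example. The cipher below is deliberately simplistic and not real cryptography, and nothing here reflects Apple’s actual backup design; it only shows the property at stake: if the key lives solely on the user’s device, the server, and anyone who subpoenas it, holds unreadable ciphertext.

```python
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: hash the key with a counter until n bytes accumulate.
    out = b""
    for i in count():
        if len(out) >= n:
            break
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice with the same key
    # restores the original bytes.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The key exists only on the user's device, never on the server.
user_key = hashlib.sha256(b"key held only on the user's device").digest()

backup = b"contacts, photos, notes"
stored_on_server = xor_cipher(user_key, backup)  # all the server ever sees
restored = xor_cipher(user_key, stored_on_server)  # only the user can do this
```

Handing over `stored_on_server` satisfies a subpoena without revealing anything, which is exactly why the change would frustrate law enforcement, and exactly why a lost key would be catastrophic for the user.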

But there’s a reason Apple still holds the keys to its customers’ backups: If iCloud users forget their passwords, Apple can act as a locksmith, verifying their identity, letting them back in, and helping them change the locks with a new password. If Apple were to encrypt iCloud data in such a way that even it couldn’t unlock it, a lost password would leave the data encrypted and unusable forever.

Consider again the sorts of information that iCloud houses: For many, it can be the only backup for a painstakingly assembled Rolodex, reams of valuable digital notes, or a lifetime of family photos. That’s a lot to lose in a single senior moment, and it leaves users who have chosen complex, hard-to-remember passwords especially vulnerable.

The decrease in convenience and the potential for some very unhappy customers may lead Apple to make strong iCloud encryption an option only for particularly security-conscious users. Or it may even choose to forgo the feature altogether if it decides that the usability downsides are too extreme.

The difficulty of Apple’s decision reflects the enduring trade-off between usability and security. Even as developers are using good design to make security and privacy more accessible, there is a point at which better security comes at the expense of convenience.