Currently, new iPhones and iPads are equipped with “full-disk” encryption, a technology that scrambles the entirety of a device’s contents. The only way to decode the data is with a combination of two keys: a unique hardware key bundled with every device, and a passcode chosen by a user.
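The two-key idea can be sketched in a few lines of Python. This is an illustrative toy, not Apple's actual algorithm: the function name, the iteration count, and the use of PBKDF2 here are assumptions made for the sketch. The point it demonstrates is that the unlock key is derived from both the device's hardware secret and the user's passcode, so neither one alone is enough.

```python
import hashlib
import os

# Hypothetical per-device secret, fused into the chip at manufacture.
hardware_key = os.urandom(32)

def derive_unlock_key(passcode: str, hardware_key: bytes) -> bytes:
    # "Tangle" the passcode with the hardware key using a slow key-derivation
    # function, so that passcode guesses must be run against this device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_key, 100_000)

key_right = derive_unlock_key("1234", hardware_key)
key_wrong = derive_unlock_key("1235", hardware_key)
assert key_right != key_wrong  # a wrong passcode yields a useless key
```

Because the hardware key never leaves the device, even an attacker who copies the encrypted storage cannot try passcodes anywhere else.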
This powerful method of encryption has stymied even the FBI, which is now asking Apple for help as it tries to break into an iPhone 5c that was in the possession of Syed Farook, one of the shooters who killed 14 people and injured 22 in the San Bernardino attack.
But there’s a catch: Although all the data stored on Farook’s phone is locked up—inaccessible even to Apple—the company can easily retrieve any information the phone sent to its servers in the form of routine iCloud backups.
The government said Farook’s phone stopped sending data to iCloud several weeks before the attack, suggesting that he manually turned off the setting. If he hadn’t, the FBI would not need to get into the phone at all, or even know where it is: Law enforcement can use subpoenas to compel Apple to turn over data from iCloud backups on its servers.
According to Apple’s latest transparency report, the company turned over content in 295 different iCloud accounts in response to government requests in the first half of 2015. Included in iCloud backups are emails, iMessages, photos, contacts, calendar events, notes, reminders, Safari history, and passwords.
Apple’s ability to dip into iCloud backups is, in essence, a security hole. The company’s lawyers have argued that writing code that would help the FBI amounts to opening up a “back door” that criminals and hackers could exploit in the future. If criminals can steal that code, Apple says, they’d be able to break into nearly any iPhone or iPad. But the same vulnerability holds for iCloud data, which could be accessed by an intruder who gains access to Apple’s systems.
The change Apple seems to be considering would put iCloud data beyond the reach of even its own employees. If the police asked Apple for someone’s iCloud backups, the company could hand over only encrypted data, which is virtually worthless without the keys to decrypt it. And only the user who backed up the data would hold those keys.
But there’s a reason Apple still hangs on to the keys to its customers’ backups: If an iCloud user forgets a password, Apple can act as a locksmith and let the user back in—once it verifies his or her identity—and help the user change the locks with a new password. If Apple were to encrypt iCloud data in such a way that even it couldn’t unlock it, a lost password would leave the data encrypted and unusable forever.
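That trade-off can be made concrete with a toy sketch in Python. Everything here is an assumption for illustration—the salt, the password, and the simple hash-based keystream stand in for the vetted authenticated encryption a real system would use. What it shows is the shape of the problem: if the server stores only ciphertext encrypted under a password-derived key, the right password recovers the data and nothing else does.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream (illustration only): hash the key with a counter.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(password: str, data: bytes, salt: bytes) -> bytes:
    # Derive a key from the password alone; XOR with the keystream.
    # Running the same operation twice with the same password decrypts.
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

salt = b"per-account-salt"  # hypothetical, stored alongside the backup
backup = encrypt("correct horse", b"family photos", salt)

# The server holds only `backup` and `salt`. The right password restores
# the data; any other password yields garbage—and so does a forgotten one.
restored = encrypt("correct horse", backup, salt)
assert restored == b"family photos"
garbage = encrypt("wrong password", backup, salt)
assert garbage != b"family photos"
```

The sketch also makes the recovery dilemma visible: there is no branch in this code where anyone but the password holder gets the plaintext back, which is precisely why a lost password would mean lost data.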
Consider again the sorts of information that iCloud houses: For many, it can be the only backup for a painstakingly assembled Rolodex, reams of valuable digital notes, or a lifetime of family photos. That’s a lot to lose in a single senior moment, and it leaves users who have chosen complex, hard-to-remember passwords especially vulnerable.