Is Law Enforcement Crying Wolf About the Dangers of Locked Phones?

The examples put forward by FBI Director James Comey and his defenders are underwhelming.


If the FBI takes the position that encrypted iPhones and other secure electronic devices pose a significant impediment to law enforcement, Susan Hennessey and Benjamin Wittes write at Lawfare, it is reasonable to demand that it do “more than cry wolf.” The FBI should “show us the cases in which the absence of extraordinary law enforcement access to encrypted data is actually posing a problem.”

And in the last couple of weeks, the authors argue, the FBI “has shown some serious wolf.” First, they cite FBI Director James Comey’s testimony to Congress:

“A woman was murdered in Louisiana last summer, eight months pregnant, killed, no clue as to who did it, except her phone is there when she's found killed. They couldn't open it, still can't open it. So the case remains unsolved.” (The discussion is available here starting at 31:00.)

They add:

Then came the filing in the San Bernardino case this week. Note that this is a case that has a potentially serious ISIS link. The FBI has been sitting on one of the shooter’s phones for more than two months, unable to open it. It wants Apple’s help to determine “who [the shooters] may have communicated with to plan and carry out the IRC shootings, where Farook and Malik may have traveled to and from before and after the incident, and other pertinent information that would provide more information about their and others’ involvement in the deadly shooting.”

This is, in other words, a law enforcement and intelligence interest of the highest order: involving knowing for criminal justice purposes who may have been involved in an attack that took place within the United States and for prospective purposes who may be planning other such attacks.

For the Lawfare authors, these two cases are compelling evidence that strong security on consumer devices poses a serious enough problem to justify weakening device security for everyone.

In contrast, I am extremely underwhelmed.

Let’s begin with the murder victim in Louisiana.

For the first 230 years of U.S. history, police officers managed to investigate murders without the benefit of any evidence from smartphones belonging to the victim. Keep that baseline in mind, because any information that the cops get from the Louisiana victim’s device leaves them better off than all prior generations of law enforcement. And they presumably did get some useful information, because a locked device doesn’t prevent them from going to the phone company and getting access to call, text, and location data generated by the device.

It doesn’t stop the authorities from going to Uber to see when the victim last took a ride, to her cloud backup to see what was last uploaded, to her email client to see if she sent or received any messages through that medium, and on and on and on.

Admittedly, there could be something only on the device relevant to the murder. The victim could’ve recorded a quick voice memo. She could’ve jotted a note to herself. She could’ve snapped a photograph of the killer in the moment before he acted. But you’ve gotta think that the odds are against there being conclusive evidence on the phone that isn’t available elsewhere, not only because so much data is available in multiple places, but because the killer left the phone with the victim. If there was evidence of his (or her) identity on the device, the killer didn’t know it.

Comey says that “they couldn’t open it, still can’t open it. So the case remains unsolved.” But it is possible, and seems likely, that it won’t be solved even if they do open the phone. The fact that there’s an unsolved murder case where some evidence might or might not be on a locked phone doesn’t seem like a compelling anecdote demonstrating the need to weaken security on everyone’s consumer devices. And if you’re worried that you’ll be killed and the lock on your smartphone will prevent the cops from finding your killer, by all means, disable the lock screen or leave a copy of your code in a safe-deposit box. Everyone is free to decide when the costs of having strong security outweigh the benefits.

The San Bernardino case is much more compelling. Fourteen people were killed. And it’s possible that the iPhone in question has data that could lead to a terrorist collaborator.

Even so, there are many factors that make me underwhelmed by the San Bernardino case, too. The killers deliberately destroyed other devices but didn’t bother destroying this one. It was a work phone issued by San Bernardino County. Until some time prior to the killings it was backing up to iCloud, and the FBI was able to get those backups.

So once again, law enforcement personnel got information about some of what was on the phone. They’re missing a mere fraction of its contents. And one would guess that in the time between the last iCloud backup and the terrorist attack, Syed Farook did not suddenly start updating his work iPhone’s contacts with the addresses of his terrorist buddies while refraining from calling or texting them (which the phone company could tell the FBI).

Nor is it likely that he left a voice memo with information the FBI would love to know.

Sure, you never know what clue might lurk on a single smartphone and nowhere else. I’d want to look inside that iPhone if I were the FBI. But the odds seem low that this locked smartphone contains a breakthrough in the recesses of its data that the FBI can’t access.

In both of these cases, the FBI got a lot more data associated with these locked smartphones than it would have had in the pre-smartphone era. Yet it presents the cases as if the mere chance that there could be even more valuable information, however improbable, is a big problem for law enforcement in the age of encryption.

The truth is that despite the spread of encryption, law enforcement is living in a golden age of surveillance. In fact, the rapidly increasing capabilities of Big Brother pose a far greater threat to Americans than criminals or terrorists exploiting new ways to “go dark.” Acting surreptitiously is harder than ever in this world.

The final point to bear in mind is how little Americans will benefit if the FBI gets its way here. If iPhones are easy for the FBI to breach, the next San Bernardino shooter won’t just leave theirs on a table, blowing their whole network after an attack. They’ll abandon the iPhone, so that only non-terrorists are vulnerable to having their security breached; they’ll use less mainstream tools to encrypt their data; or they’ll “go dark” the old-fashioned way, by dropping their phone off a boat or tossing it off a bridge or pouring gasoline over the device and lighting a match.

All but the dumbest murderers and terrorists will adapt. And Americans will be left with dramatically less secure devices in exchange for infinitesimally more security. If these are the FBI’s best examples, bad guys “going dark” is a less-costly phenomenon than I had imagined.