Black Lives Matter activist DeRay Mckesson (whom Clare interviewed last week about his mayoral campaign in Baltimore) expressed strong views today under the hashtag #DontHackApple. The gist of his tweetstorm:
When I was arrested in protest, my iPhones were in police custody. They were secure. The police couldn't access my info. #DontHackApple — deray mckesson (@deray) February 22, 2016
Kaveh has the latest on the ongoing saga:
In a simply worded, direct letter posted to the Lawfare blog Sunday, [FBI Director James] Comey framed his controversial request as a routine effort to perform due diligence, while taking a moral stand on the San Bernardino investigation. “We can't look the survivors in the eye, or ourselves in the mirror, if we don't follow this lead,” Comey wrote. He called on “folks” to “take a deep breath,” and remember the terrorist attack that set off the probe.
Apple countered Monday morning with a frequently-asked-questions page that broke down, in a similarly straightforward way, the facts of the case and its argument for resisting the FBI’s request. And in an email to Apple employees that was shared with reporters, Cook wrote of receiving messages of support from “thousands of people in all 50 states.”
A reader notes:
There was an NPR piece today about how Apple apparently complied with this type of request 70 times prior to last October, before some judge made it a public issue, after which Apple refused to comply in that case and in the current shooting-related case. The piece did note, though, that those 70 instances involved older software that was easier to exploit to unlock phones; the new generation of phones would require the creation of this supposed software that doesn’t yet exist.
Another reader sounds off:
I think both sides of this debate have pretty valid arguments, even if unintentionally. On the FBI’s side, the old world of thinking is that if the terrorist had used pen and paper to write all this stuff down, they’d be able to conduct their search without having to involve a third party, as they are having to do now.
On the other hand, Apple’s got a pretty legitimate argument this is a slippery slope. Even if what the FBI is asking is reasonable this time, can Apple really trust the government not to abuse this backdoor with the FISA courts already in place or not to pass another more extensive version of the Patriot Act? Also, I get that Apple may have done this before, but they have to draw the line somewhere. This seems like a good place to start the debate.
Update from a reader:
Would-be terrorists and privacy fanatics beware: Don't switch your phone from a pattern or PIN unlock to a fingerprint unlock:
. . . a criminal defendant can be compelled to give up his fingerprint and unlock his cellphone in the course of a criminal investigation — because that's just like handing in a DNA sample or a physical key, which citizens can already be legally compelled to give to police. On the other hand, police can’t force a defendant to give up his passcode, because that's considered "knowledge" — not a physical object — and knowledge is protected by the Fifth Amendment.
A fingerprint can be obtained from a corpse as well, or from things one has touched—including the phone itself. Further, you can always change your pattern or PIN, but good luck changing your fingerprints if they leak out. I doubt experts would have much trouble mimicking a live finger with a duplicated print.
It seems to me that Apple needs to make privacy a big selling point. Its services are not as good as Google’s due to how much more Google can know about you. Google Now on Android (for someone who uses many Google services, such as Gmail and Chrome), is amazingly good, because its computers know such a person so very well. Apple has to emphasize the importance of privacy to compete. I’m all in with Google because I have no problem with Google and the government knowing about all the boring stuff I do. And Google Now makes my life so much better on a day-to-day basis. I’m willing to be an open book, but people unlike me are giving up something significant for all that privacy, and Apple needs to come through for them.
The reader follows up:
Perhaps I should have stuck this quote from The New Yorker somewhere in my note above:
On the surface, Google’s approach seems superior. Understanding context is all about data, and the company is collecting a lot more of it. Apple has your phone; Google has access to almost everything. Google’s approach might lack humanness, but the company will make up for that with accuracy and convenience. Apple’s approach will appeal to those for whom privacy is important. For now, that argument will resonate in parts of the United States and in most of Europe, while the rest of the planet will opt for a cheaper, more convenient, and, in the end, smarter system from Google.
And maybe this as well:
In terms of pure technology, Google is flying circles around Apple. There's a strong case to be made that Apple's focus on user design and experience makes it the manufacturer of the superior smartphone. But when it comes to stuff like facial recognition, virtual reality, driverless cars, or even less esoteric stuff like maps and directions — there’s no argument. Google is ahead of the pack, just about every time.
All of that stuff that Google is so good at, however, involves the judicious use of "machine learning," or technology that gets smarter over time. To make the cognitive wheels of machine learning turn, you need huge amounts of data, applied systematically to a complicated mathematical model. . . . So this focus on privacy is also self-serving for Apple: It takes a weakness — the fact that it's not as good as Google at collecting data to build great services — and turns it into a strength that helps it sell its most important product.