The dress code said it all.
When Admiral Michael S. Rogers, director of the National Security Agency, commander of U.S. Cyber Command, and recipient of the Navy Distinguished Service Medal, recently walked into a cybersecurity conference, his uniform bore 20 ribbons and four badges from his esteemed Navy career. Rogers’ hair was neat and precise, in full compliance with Navy regulations on grooming standards for personal appearance.
His keynote followed the first panel, whose members wore jeans, cardigans, and button-down shirts with rolled-up sleeves. Kevin Bankston, policy director of the Open Technology Institute, quipped that the “answer to cybersecurity is letting people wear hoodies in D.C.” With a long ponytail and full grey beard, Bruce Schneier, known as one of the world’s best cryptographers, was decidedly not in compliance with Navy uniform regulations.
These two cultures have to come together to reconcile the twin goals of keeping us safe from online threats while preserving our liberties online. Bringing the uniforms and the hoodies into one conversation was both a reflection of the challenge and a call for solutions.
Both Schneier and Alex Stamos, chief information security officer for Yahoo, sparred with Rogers over the issue of building backdoors into source code that would grant the government access to information that technology companies have been resistant to share. Apple and Google recently announced that their software would encrypt all data so the government can’t access it, even if the companies are presented with a warrant.
“It sounds like you agree with [FBI] Director Comey that we should be building defects into the encryption in our products,” said Stamos, who compared backdoors to “drilling a hole in the windshield,” which would leave code dangerously vulnerable to malicious hacking.
“That would be your characterization,” Rogers shot back.
If this sounds like an adversarial exchange, that’s because it was, emblematic of the tension between the government (the uniforms) and technology companies (the hoodies). When the White House hosted a cybersecurity summit this month at Stanford University, for example, Apple CEO Tim Cook gave a blistering defense of user privacy. “History has shown us that sacrificing our right to privacy can have dire consequences,” Cook said.
This conference served as the launch of a new Cybersecurity Initiative, which combines expertise from multiple fields to address the challenges posed by cyber threats. One point of agreement among many of the technologists, coders, and tech executives—not to mention audience members—was that the threats faced by both the NSA and tech companies don’t live in a vacuum. They are shared, interdependent, and come from both state and non-state actors.
And the next step may not be just to get the uniforms and hoodies into one room, but to bring more technologists into policy work.
“If you look at the leadership positions of cybersecurity, they are not the people who have actually done this,” said Stamos*. “It would be nice to see more technologists do more of these [policy] things.”
And in addition to diversity in who is creating policy, there needs to be greater diversity in who designs products, Tara Whalen, a privacy analyst at Google, pointed out. Low-income communities are more likely to use products that are not designed for their specific needs. You have to “design with people and not just for people. You want to hear from the users and not just the people who feel they know what the users want,” she said.
Because cyber threats represent an increasingly urgent danger to governments and businesses alike, new capabilities are required to meet them. Assistant Attorney General John Carlin announced during his panel that he would consider indicting those who assist with ISIS social media efforts. “It is a new way to propagandize and reach individuals in a very targeted fashion in their home, and the ability to produce the slick propaganda is cheap and widely available,” Carlin said. “It presents a new threat.”
The phenomenon of ISIS using social media to recruit foreign fighters also illustrates a broader problem with the Internet. While the Internet has democratized, informed, and made us laugh, it has also served to silo us off from each other. We can surround ourselves in an echo chamber where only our own opinions are heard. So if the adversarial exchanges between Rogers and technologists served a greater purpose, it was simply to confront and engage the other side.
“Rarely do we have a conversation on cybersecurity that engages in tough questions,” reflected New America president and CEO Anne-Marie Slaughter. Yet throughout the conference, it was apparent that many of these questions needed answers. Is a cyber-attack a declaration of war? What are the norms around cyber-attacks? How should a state respond?
Rogers compared the current cybersecurity landscape to the first 10 years in the debate about nuclear deterrence. In his view, we are living in a kind of wild west, where ideas that will shape our strategic thinking are still being formed. He suggested that the answers to these questions might come from the academic community, similar to the nuclear deterrence theories of Henry Kissinger and Thomas Schelling. “There is a place in the academic world for this kind of discussion [when it comes to cyber],” he said.
Yet the analogy between cybersecurity and nuclear deterrence might not be so straightforward. Schneier argued that a technological approach to encryption was more important than a legal one; Rogers disagreed, contending that the legal approach should take precedence. This rift is problematic because the way each side frames the issue leads them to different policy outcomes.
To answer all of these questions, and to improve usability and cybersecurity in particular, may require diverse thinking. The challenges that the Internet presents will not be straightforward, but rather asymmetric. We will not know when or where the next cyber attack will take place, or in which dorm room the next billion-dollar company is launched. Every computer is a tool capable of good or bad.
Unless it’s a Seth Rogen movie. Then it’s a tool for evil.
* An earlier version of this article mistakenly attributed this quote to Cheri Mcguire. The person who should have been quoted is Alex Stamos. We regret the error.
This post appears courtesy of New America's Weekly Wonk magazine.