"Snowden’s biggest single victim," a senior intelligence official told The New York Times, isn't his revelation of the National Security Agency's existing surveillance system. Instead, it's the NSA's push to broaden that ability to cover all of the scannable traffic on the internet. The public relations damage done to the agency has made an expansion of its ability to read data a non-starter. For now, anyway.
In a document released to coincide with the president's press conference last Friday, the NSA described the breadth of its existing data-scanning ability. As has been documented extensively, the agency monitors traffic on fiber-optic cables entering the United States (and elsewhere), scanning it in near-real time for indicators of terror activity. But, it insists, it doesn't actually review that much.
By its own assessment, the NSA can access ("touch") about 29.2 petabytes of traffic a day, roughly 1.6 percent of what crosses the internet daily, and it reviews about seven terabytes of that data. As author and professor Jeff Jarvis notes, this probably isn't just the NSA stumbling around, reviewing content at random. "Keep in mind that most of the data passing on the net is not email or web pages," he writes. "It's media." That media, videos and photos, isn't reviewed by the NSA. Messages and web traffic, the text content, are. And a lot more text (and metadata) fits into seven terabytes than photos do. Seven terabytes is over a third of the contents of the Library of Congress, according to WolframAlpha.
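The arithmetic behind those figures is easy to check. A quick sketch, taking the 1,826-petabytes-per-day total for internet traffic from the NSA's own August 2013 memo (a figure not quoted in this article):

```python
# Back-of-the-envelope check of the NSA's stated figures.
DAILY_INTERNET_PB = 1826   # total internet traffic per day, per the NSA memo
TOUCHED_PB = 29.2          # traffic the NSA says it can "touch" daily
REVIEWED_TB = 7.0          # traffic actually selected for review, in terabytes

# Share of all daily traffic the agency touches
touched_share = TOUCHED_PB / DAILY_INTERNET_PB * 100

# Share of the touched traffic it actually reviews (1 PB = 1,000 TB)
reviewed_share = REVIEWED_TB / (TOUCHED_PB * 1000) * 100

print(f"touched:  {touched_share:.2f}% of daily traffic")
print(f"reviewed: {reviewed_share:.3f}% of what is touched")
```

The first figure lands at about 1.6 percent, matching the share the agency cited; the second works out to a few hundredths of a percent, which is why the NSA can plausibly claim it reviews almost nothing of what it can reach.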
[A]s he did in Iraq, [NSA head Keith] Alexander has pushed hard for everything he can get: tools, resources and the legal authority to collect and store vast quantities of raw information on American and foreign communications.
That has included an apparent push to expand that 1.6 percent to something more like 100 percent: in other words, to touch it all. The Times reports that the effort is central to Alexander's cyberwarfare plans. To stop hostile attacks from hackers or foreign governments, the NSA wants to be able to see everything that's coming into the United States. The most generous analogy is anti-virus software on your computer: by acting as a gateway for internet communication, it is meant to prevent any damage from being done. If your computer checked only a fraction of incoming traffic, it would be much less successful.
Under this proposal, the government would latch into the giant “data pipes” that feed the largest Internet service providers in the United States, companies like A.T.&T. and Verizon. The huge volume of traffic that runs through those pipes, particularly e-mails, would be scanned for signs of anything from computer servers known for attacks on the United States or for stealing information from American companies. Other “metadata” would be inspected for evidence of malicious software.
“It’s defense at network speed,” General Alexander told a Washington security-research group recently, according to participants. “Because you have only milliseconds.”
In the new political environment prompted by the Edward Snowden leaks, getting the infrastructure in place to do that analysis has become far less likely. Extending the analogy: it's as though your anti-virus software also had the power to tell the FBI when it found an email suspicious.
"But this summer," The Times notes, "the mood in Congress has changed." At this point, the NSA's push to become our national cyberwatchdog isn't likely to happen. At least until Congress is more confident we're not the ones who will be bitten.