Homeland Insecurity

A top expert says America's approach to protecting itself will only make matters worse. Forget "foolproof" technology—we need systems designed to fail smartly

To stop the rampant theft of expensive cars, manufacturers in the 1990s began to make ignitions very difficult to hot-wire. This reduced the likelihood that cars would be stolen from parking lots—but apparently contributed to the sudden appearance of a new and more dangerous crime, carjacking.

After a vote against management, Vivendi Universal announced earlier this year that its electronic shareholder-voting system, which it had adopted to tabulate votes efficiently and securely, had been broken into by hackers. Because the new system eliminated the old paper ballots, recounting the votes—or even independently verifying that the attack had occurred—was impossible.

To help merchants verify and protect the identity of their customers, marketing firms and financial institutions have created large computerized databases of personal information: Social Security numbers, credit-card numbers, telephone numbers, home addresses, and the like. As these databases are increasingly interconnected by means of the Internet, they have become irresistible targets for criminals. From 1995 to 2000 the incidence of identity theft tripled.

As was often the case, Bruce Schneier was thinking about a really terrible idea. We were driving around the suburban-industrial wasteland south of San Francisco, on our way to a corporate presentation, while Schneier looked for something to eat not purveyed by a chain restaurant. This was important to Schneier, who in addition to being America's best-known ex-cryptographer is a food writer for an alternative newspaper in Minneapolis, where he lives. Initially he had been sure that in the crazy ethnic salad of Silicon Valley it would be impossible not to find someplace of culinary interest—a Libyan burger stop, a Hmong bagelry, a Szechuan taco stand. But as the rented car swept toward the vast, amoeboid office complex that was our destination, his faith slowly crumbled. Bowing to reality, he parked in front of a nondescript sandwich shop, disappointment evident on his face.

Web-only Sidebar: "A Primer on Public-key Encryption"
Charles Mann explains public-key encryption and traces its history.

Schneier is a slight, busy man with a dark, full, closely cropped beard. Until a few years ago he was best known as a prominent creator of codes and ciphers; his book Applied Cryptography (1993) is a classic in the field. But despite his success he virtually abandoned cryptography in 1999 and co-founded a company named Counterpane Internet Security. Counterpane has spent considerable sums on advanced engineering, but at heart the company is dedicated to bringing one of the oldest forms of policing—the cop on the beat—to the digital realm. Aided by high-tech sensors, human guards at Counterpane patrol computer networks, helping corporations and governments to keep their secrets secret. In a world that is both ever more interconnected and full of malice, this is a task of considerable difficulty and great importance. It is also what Schneier long believed cryptography would do—which brings us back to his terrible idea.

"Pornography!" he exclaimed. If the rise of the Internet has shown anything, it is that huge numbers of middle-class, middle-management types like to look at dirty pictures on computer screens. A good way to steal the corporate or government secrets these middle managers are privy to, Schneier said, would be to set up a pornographic Web site. The Web site would be free, but visitors would have to register to download the naughty bits. Registration would involve creating a password—and here Schneier's deep-set blue eyes widened mischievously.

People have trouble with passwords. The idea is to have a random string of letters, numbers, and symbols that is easy to remember. Alas, random strings are by their nature hard to remember, so people use bad but easy-to-remember passwords, such as "hello" and "password." (A survey last year of 1,200 British office workers found that almost half chose their own name, the name of a pet, or that of a family member as a password; others based their passwords on the names Darth Vader and Homer Simpson.) Moreover, computer users can't keep different passwords straight, so they use the same bad passwords for all their accounts.

Many of his corporate porn surfers, Schneier predicted, would use for the dirty Web site the same password they used at work. Not only that, many users would surf to the porn site on the fast Internet connection at the office. The operators of Schneier's nefarious site would thus learn that, say, "Joesmith," who accessed the Web site from Anybusiness.com, used the password "JoeS." By trying to log on at Anybusiness.com as "Joesmith," they could learn whether "JoeS" was also the password into Joesmith's corporate account. Often it would be.

"In six months you'd be able to break into Fortune 500 companies and government agencies all over the world," Schneier said, chewing his nondescript meal. "It would work! It would work—that's the awful thing."

During the 1990s Schneier was a field marshal in the disheveled army of computer geeks, mathematicians, civil-liberties activists, and libertarian wackos that—in a series of bitter lawsuits that came to be known as the Crypto Wars—asserted the right of the U.S. citizenry to use the cryptographic equivalent of kryptonite: ciphers so powerful they cannot be broken by any government, no matter how long and hard it tries. Like his fellows, he believed that "strong crypto," as these ciphers are known, would forever guarantee the privacy and security of information—something that in the Information Age would be vital to people's lives. "It is insufficient to protect ourselves with laws," he wrote in Applied Cryptography. "We need to protect ourselves with mathematics."

Schneier's side won the battle as the nineties came to a close. But by that time he had realized that he was fighting the wrong war. Crypto was not enough to guarantee privacy and security. Failures occurred all the time—which was what Schneier's terrible idea demonstrated. No matter what kind of technological safeguards an organization uses, its secrets will never be safe while its employees are sending their passwords, however unwittingly, to pornographers—or to anyone else outside the organization.

The Parable of the Dirty Web Site illustrates part of what became the thesis of Schneier's most recent book, Secrets and Lies (2000): The way people think about security, especially security on computer networks, is almost always wrong. All too often planners seek technological cure-alls, when such security measures at best limit risks to acceptable levels. In particular, the consequences of going wrong—and all these systems go wrong sometimes—are rarely considered. For these reasons Schneier believes that most of the security measures envisioned after September 11 will be ineffective, and that some will make Americans less safe.

It is now a year since the World Trade Center was destroyed. Legislators, the law-enforcement community, and the Bush Administration are embroiled in an essential debate over the measures necessary to prevent future attacks. To armor-plate the nation's security they increasingly look to the most powerful technology available: retina, iris, and fingerprint scanners; "smart" driver's licenses and visas that incorporate anti-counterfeiting chips; digital surveillance of public places with face-recognition software; huge centralized databases that use data-mining routines to sniff out hidden terrorists. Some of these measures have already been mandated by Congress, and others are in the pipeline. State and local agencies around the nation are adopting their own schemes. More mandates and more schemes will surely follow.

Schneier is hardly against technology—he's the sort of person who immediately cases public areas for outlets to recharge the batteries in his laptop, phone, and other electronic prostheses. "But if you think technology can solve your security problems," he says, "then you don't understand the problems and you don't understand the technology." Indeed, he regards the national push for a high-tech salve for security anxieties as a reprise of his own early and erroneous beliefs about the transforming power of strong crypto. The new technologies have enormous capacities, but their advocates have not realized that the most critical aspect of a security measure is not how well it works but how well it fails.

The Crypto Wars

If mathematicians from the 1970s were suddenly transported through time to the present, they would be happily surprised by developments such as the proofs of Kepler's conjecture (proposed in 1611, confirmed in 1998) and of Fermat's last theorem (1637, 1994). But they would be absolutely astonished by the RSA Conference, the world's biggest trade show for cryptographers. Sponsored by the cryptography firm RSA Security, the conferences are attended by as many as 10,000 cryptographers, computer scientists, network managers, and digital-security professionals. What would amaze past mathematicians is not just the number of attendees but that such gatherings exist at all.

Why the Maginot Line Failed

In fact, the Maginot Line, the chain of fortifications on France's border with Germany, was indicative neither of despair about defeating Germany nor of thought mired in the past. It was instead evidence of faith that technology could substitute for manpower. It was a forerunner of the strategic bomber, the guided missile, and the "smart bomb." The same faith led to France's building tanks with thicker armor and bigger guns than German tanks had, deploying immensely larger quantities of mobile big guns, and above all committing to maintain a continuous line—that is, advancing or retreating in such coordination as to prevent an enemy from establishing a salient from which it could cut off a French unit from supplies and reinforcements. (Today, military strategists call this "force protection.") But having machines do the work of men and putting emphasis on minimal loss of life carried a price in slowed-down reaction times and lessened initiative for battlefield commanders. —Ernest R. May, Strange Victory: Hitler's Conquest of France (2000)

Cryptology is a specialized branch of mathematics with some computer science thrown in. As recently as the 1970s there were no cryptology courses in university mathematics or computer-science departments; nor were there crypto textbooks, crypto journals, or crypto software. There was no private crypto industry, let alone venture-capitalized crypto start-ups giving away key rings at trade shows (crypto key rings—techno-humor). Cryptography, the practice of cryptology, was the province of a tiny cadre of obsessed amateurs, the National Security Agency, and the NSA's counterparts abroad. Now it is a multibillion-dollar field with applications in almost every commercial arena.

As one of the people who helped to bring this change about, Schneier is always invited to speak at RSA conferences. Every time, the room is too small, and overflow crowds, eager to hear their favorite guru, force the session into a larger venue. That is what happened when I saw him speak at an RSA conference in San Francisco's Moscone Center last year. There was applause from the hundreds of seated cryptophiles when Schneier mounted the stage, and more applause from the throng standing in the aisles and exits when he apologized for the lack of seating capacity. He was there to talk about the state of computer security, he said. It was as bad as ever, maybe getting worse.

In the past security officers were usually terse ex-military types who wore holsters and brush cuts. But as computers have become both attackers' chief targets and their chief weapons, a new generation of security professionals has emerged, drawn from the ranks of engineering and computer science. Many of the new guys look like people the old guard would have wanted to arrest, and Schneier is no exception. Although he is a co-founder of a successful company, he sometimes wears scuffed black shoes and pants with a wavering press line; he gathers his thinning hair into a straggly ponytail. Ties, for the most part, are not an issue. Schneier's style marks him as a true nerd—someone who knows the potential, both good and bad, of technology, which in our technocentric era is an asset.

Schneier was raised in Brooklyn. He got a B.S. in physics from the University of Rochester in 1985 and an M.S. in computer science from American University two years later. Until 1991 he worked for the Department of Defense, where he did things he won't discuss. Lots of kids are intrigued by codes and ciphers, but Schneier was surely one of the few to ask his father, a lawyer and a judge, to write secret messages for him to analyze. On his first visit to a voting booth, with his mother, he tried to figure out how she could cheat and vote twice. He didn't actually want her to vote twice—he just wanted, as he says, to "game the system."

Unsurprisingly, someone so interested in figuring out the secrets of manipulating the system fell in love with the systems for manipulating secrets. Schneier's childhood years, as it happened, were a good time to become intrigued by cryptography—the best time in history, in fact. In 1976 two researchers at Stanford University invented an entirely new type of encryption, public-key encryption, which abruptly woke up the entire field.

Public-key encryption is complicated in detail but simple in outline. All ciphers employ mathematical procedures called algorithms to transform messages from their original form into an unreadable jumble. (Cryptographers work with ciphers and not codes, which are spy-movie-style lists of prearranged substitutes for letters, words, or phrases—"meet at the theater" for "attack at nightfall.") Most ciphers use secret keys: mathematical values that plug into the algorithm. Breaking a cipher means figuring out the key. In a kind of mathematical sleight of hand, public-key encryption encodes messages with keys that can be published openly and decodes them with different keys that stay secret and are effectively impossible to break using today's technology. (A more complete explanation of public-key encryption appears in the Web-only sidebar "A Primer on Public-key Encryption.")
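That outline can be made concrete in a few lines of code. What follows is a minimal sketch of textbook RSA arithmetic in Python, with deliberately tiny primes; every number, function name, and message in it is invented for illustration and has nothing to do with real key sizes or with any production cryptographic library.

    def make_keys(p, q, e=17):
        # Build a toy key pair from two primes. Real keys use primes
        # hundreds of digits long; these are small enough to follow by hand.
        n = p * q                       # the public modulus
        phi = (p - 1) * (q - 1)
        d = pow(e, -1, phi)             # private exponent (Python 3.8+)
        return (n, e), (n, d)           # (public key, private key)

    def encrypt(public_key, message):
        n, e = public_key
        return pow(message, e, n)       # anyone may do this with the public key

    def decrypt(private_key, ciphertext):
        n, d = private_key
        return pow(ciphertext, d, n)    # only the private-key holder can do this

    public, private = make_keys(61, 53)  # toy primes, for illustration only
    c = encrypt(public, 42)              # 42 stands in for a numeric message
    print(c, decrypt(private, c))        # prints the jumble, then 42 again

The pair (n, e) can be published anywhere; turning the jumble back into the original message requires d, which never leaves its owner's hands.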

The best-known public-key algorithm is the RSA algorithm, whose name comes from the initials of the three mathematicians who invented it. RSA keys are created by manipulating big prime numbers. If the private decoding RSA key is properly chosen, guessing it necessarily involves factoring a very large number into its constituent primes, something for which no mathematician has ever devised an adequate shortcut. Even if demented government agents spent a trillion dollars on custom factoring computers, Schneier has estimated, the sun would likely go nova before they cracked a message enciphered with a public key of sufficient length.
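A hypothetical sketch of that claim, continuing the toy example above: recovering the private exponent from a public key amounts to factoring the modulus n, which trial division dispatches instantly at this scale but which no known method can manage for primes of real-world length. The attacker code below is invented for illustration.

    def crack(n, e):
        # Recover a toy private key by brute-force factoring of n.
        p = next(f for f in range(2, n) if n % f == 0)
        q = n // p
        return pow(e, -1, (p - 1) * (q - 1))   # the private exponent d

    n, e = 61 * 53, 17              # the toy public key from the sketch above
    d = crack(n, e)                 # instant here; infeasible at real key sizes
    c = pow(42, e, n)               # a message enciphered with the public key
    print(pow(c, d, n))             # the cracked key reads it: prints 42

At the key lengths used in practice, the encryption and decryption steps above still run in a blink, while the crack step, even with the best known factoring algorithms substituted for trial division, runs into the astronomical time scales Schneier describes.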

Schneier and other technophiles grasped early how important computer networks would become to daily life. They also understood that those networks were dreadfully insecure. Strong crypto, in their view, was an answer of almost magical efficacy. Even federal officials believed that strong crypto would Change Everything Forever—except they thought the change would be for the worse. Strong encryption "jeopardizes the public safety and national security of this country," Louis Freeh, then the director of the (famously computer-challenged) Federal Bureau of Investigation, told Congress in 1995. "Drug cartels, terrorists, and kidnappers will use telephones and other communications media with impunity knowing that their conversations are immune" from wiretaps.

The Crypto Wars erupted in 1991, when Washington attempted to limit the spread of strong crypto. Schneier testified before Congress against restrictions on encryption, campaigned for crypto freedom on the Internet, co-wrote an influential report on the technical snarls awaiting federal plans to control cryptographic protocols, and rallied 75,000 crypto fans to the cause in his free monthly e-mail newsletter, Crypto-Gram. Most important, he wrote Applied Cryptography, the first-ever comprehensive guide to the practice of cryptology.

Washington lost the wars in 1999, when an appellate court ruled that restrictions on cryptography were illegal, because crypto algorithms were a form of speech and thus covered by the First Amendment. After the ruling the FBI and the NSA more or less surrendered. In the sudden silence the dazed combatants surveyed the battleground. Crypto had become widely available, and it had indeed fallen into unsavory hands. But the results were different from what either side had expected.

Charles C. Mann, an Atlantic contributing editor, has been writing for the magazine since 1984. His recent books include 1491, based on his March 2002 cover story, and 1493.
