Large-scale federal databases would undergo similar assaults. The prospect is worrying, given the government's long-standing reputation for poor information security. Since September 11 at least forty government networks have been publicly cracked by typographically challenged vandals with names like "CriminalS," "S4t4n1c S0uls," "cr1m3 0rg4n1z4d0," and "Discordian Dodgers." Summing up the problem, a House subcommittee last November awarded federal agencies a collective computer-security grade of F. According to representatives of Oracle, the federal government has been talking with the company about employing its software for the new central databases. But judging from the past, involving the private sector will not greatly improve security. In March, CERT/CC, a computer-security watchdog based at Carnegie Mellon University, warned of nineteen vulnerabilities in Oracle's database software. Meanwhile, a centerpiece of the company's international advertising is the claim that its software is "unbreakable." Other software vendors fare no better: CERT/CC issues a constant stream of vulnerability warnings about every major software firm.
Schneier, like most security experts I spoke to, does not oppose consolidating and modernizing federal databases per se. To avoid creating vast new opportunities for adversaries, though, the overhaul should be incremental and small-scale. Even so, it would need to be planned with extreme care—something that shows little sign of happening.
One key to the success of digital revamping will be a little-mentioned, even prosaic feature: training the users not to circumvent secure systems. The federal government already has several computer networks—INTELINK, SIPRNET, and NIPRNET among them—that are fully encrypted, accessible only from secure rooms and buildings, and never connected to the Internet. Yet despite their lack of Net access the secure networks have been infected by e-mail perils such as the Melissa and I Love You viruses, probably because some official checked e-mail on a laptop, got infected, and then plugged the same laptop into the classified network. Because secure networks are unavoidably harder to work with, people are frequently tempted to bypass them—one reason that researchers at weapons labs sometimes transfer their files to insecure but more convenient machines.
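The failure mode is easy to model. The toy simulation below is illustrative only (the device and network names are invented stand-ins, not real systems): an "air-gapped" network stays clean only as long as no single machine ever touches both sides, and one dual-homed laptop quietly closes the gap.

    # A toy model (illustrative only) of how one laptop defeats an air gap.
    # Infection spreads to anything that touches an infected device or network.
    infected = {"internet"}  # assume the public network is compromised

    def plug_in(device, network):
        """Connecting a device to a network lets infection flow either way."""
        if device in infected or network in infected:
            infected.update({device, network})

    plug_in("laptop", "internet")    # official checks e-mail from home
    plug_in("laptop", "classified")  # same laptop, plugged into the secure room
    print("classified" in infected)  # True: the "air gap" no longer exists

No amount of encryption on the classified side changes this outcome; only the users' habits do, which is why training matters as much as the technology.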
Remember Pearl Harbor
Surprise, when it happens to a government, is likely to be a complicated, diffuse, bureaucratic thing ... It includes gaps in intelligence, but also intelligence that, like a string of pearls too precious to wear, is too sensitive to give to those who need it. It includes the alarm that fails to work, but also the alarm that has gone off so often it has been disconnected. It includes the unalert watchman, but also the one who knows he'll be chewed out by his superior if he gets higher authority out of bed. It includes the contingencies that occur to no one, but also those that everyone assumes somebody else is taking care of. It includes straightforward procrastination, but also decisions protracted by internal disagreement. It includes, in addition, the inability of individual human beings to rise to the occasion until they are sure it is the occasion—which is usually too late. (Unlike movies, real life provides no musical background to tip us off to the climax.) Finally, as at Pearl Harbor, surprise may include some measure of genuine novelty introduced by the enemy, and possibly some sheer bad luck. The results, at Pearl Harbor, were sudden, concentrated, and dramatic. The failure, however, was cumulative, widespread, and rather drearily familiar. This is why surprise, when it happens to a government, cannot be described just in terms of startled people. Whether at Pearl Harbor or at the Berlin Wall, surprise is everything involved in a government's (or in an alliance's) failure to anticipate effectively.

—Thomas C. Schelling, foreword to Pearl Harbor: Warning and Decision (1962), by Roberta Wohlstetter
Schneier has long argued that the best way to improve the very bad situation in computer security is to change software licenses. The buyer of a defective car or toaster can sue its maker; if software is blatantly unsafe, owners have no such recourse, because it is licensed rather than bought, and the licenses forbid litigation. It is unclear whether the licenses can legally do this (courts currently disagree), but as a practical matter it is next to impossible to win a lawsuit against a software firm. If some big software companies lose product-liability suits, Schneier believes, their confreres will begin to take security seriously.
Computer networks are difficult to keep secure in part because they have so many functions, each of which must be accounted for. For that reason Schneier and other experts tend to favor narrowly focused security measures—more of them physical than digital—that target a few precisely identified problems. For air travel, along with reinforcing cockpit doors and teaching passengers to fight back, examples include armed uniformed—not plainclothes—guards on select flights; "dead-man" switches that in the event of a pilot's incapacitation force planes to land by autopilot at the nearest airport; positive bag matching (ensuring that luggage does not get on a plane unless its owner also boards); and separate decompression facilities that detonate any altitude bombs in cargo before takeoff. None of these is completely effective; bag matching, for instance, would not stop suicide bombers. But all are well tested, known to at least impede hijackers, not intrusive to passengers, and unlikely to make planes less secure if they fail.
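Of these measures, positive bag matching is the most algorithmic, and its core logic fits in a few lines. The sketch below is hypothetical (the tags, names, and function are invented for illustration, not drawn from any airline's system): before departure, any bag whose owner has not boarded is pulled from the hold.

    # A hypothetical sketch of positive bag matching: offload any loaded bag
    # whose owner never boarded. All identifiers here are invented.
    def bags_to_offload(loaded_bags, boarded_passengers):
        """Return the tags of loaded bags whose owners are not on board."""
        boarded = set(boarded_passengers)
        return [tag for tag, owner in loaded_bags.items() if owner not in boarded]

    loaded = {"TAG001": "Adams", "TAG002": "Baker", "TAG003": "Chen"}
    print(bags_to_offload(loaded, boarded_passengers=["Baker", "Chen"]))
    # ['TAG001']: Adams checked a bag but never boarded, so it comes off.
    # The limit noted above: a bomber who boards with the bag passes this check.

The sketch also makes the measure's limitation concrete: the check compares manifests, so it catches the bomber who plants a bag and walks away, not the one willing to fly with it.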
It is impossible to guard all potential targets, because anything and everything can be subject to attack. Palestinian suicide bombers have shown this by murdering at random the occupants of pool halls and hotel meeting rooms. Horrible as these incidents are, they do not risk the lives of thousands of people, as would attacks on critical parts of the national infrastructure: nuclear-power plants, hydroelectric dams, reservoirs, gas and chemical facilities. Here a classic defense is available: tall fences and armed guards. Yet this past spring the Bush Administration cut by 93 percent the funds requested by the Energy Department to bolster security for nuclear weapons and waste; it denied completely the funds requested by the Army Corps of Engineers for guarding 200 reservoirs, dams, and canals, leaving fourteen large public-works projects with no budget for protection. A recommendation by the American Association of Port Authorities that the nation spend a total of $700 million to inspect and control ship cargo (today less than two percent of container traffic is inspected) has so far resulted in grants of just $92 million. In all three proposals most of the money would have been spent on guards and fences.