Open Secrets

Why the so-called Clipper chip—vilified as a threat to the privacy of electronic communications—is not worth losing sleep over 

The political problem the Clinton Administration least expected was a war with America's high-tech community. Between Al Gore's patronage of the "information superhighway" and the Administration's various policies favoring the computer and semiconductor industries, Bill Clinton might have imagined that the Americans who build, buy, and use computers would be among his biggest fans.

Instead, since late last year electronic forums, which are the talk shows of the computer world, have rung with bitter denunciations of Clinton and Gore for having "betrayed" and "sold out" their high-tech constituency. The complaints have to do with two related proposals affecting privacy in the electronic age: for the use of a new encryption device called the Clipper chip, and for updating the law concerning wiretaps, through the Digital Telephony bill. The subjects of these proposals are highly technical, yet the debate about them has become as intense and polarized as those over school prayer and gun control. In one month this spring some 50,000 computer users endorsed a petition circulated on the Internet which denounced the Clipper-chip plan. "If this Clipper monster isn't stopped soon, we are going to witness the criminalization of anyone who insists on complete privacy of personal communications on computer networks," one wholly typical Internet message said. "Beating Clipper is a crucial step in freeing ourselves from all the dark possibilities of an authoritarian national-security state."

When I first heard about these plans, I was ready to go on the warpath alongside my fellow on-line habitués. But the more I have learned about the proposals, the less wrought-up I have become. The Clipper initiative, which has been the better publicized of the two plans, seems to me to be foolish and misguided—but also quite easily thwarted by those who most oppose it. The Digital Telephony proposal, in comparison, seems more reasonable and balanced. Looked at closely, neither is the watershed in the history of American freedoms that the electronic community has made it out to be.

Before going into the ugly to-and-fro over the Clipper proposal, let's consider the beautiful mathematical discoveries that have given rise to the whole debate.

The Clipper controversy concerns cryptography—putting messages into code. Codes have been around for about as long as people have been communicating, but until 1976 all codes shared one great flaw. Each relied on a secret "key," or coding method, with which the "plaintext" of the message could be encoded into "ciphertext" and then decoded into plaintext at the other end. The person sending the message had to agree beforehand with the person receiving it on what the key would be—references to pages in a book, a system of letter transposition, or one of countless other possibilities. The need for two or more people to agree ahead of time on a key created problems that have been the stuff of spy novels and war histories for hundreds of years: lost keys, stolen keys, counterfeit keys, keys that arrived after the message itself.

In 1976 Whitfield Diffie and Martin Hellman, both then at Stanford, wrote a paper proposing the radically different concept of a "public key" coding system, in which sender and receiver need never discuss a secret key. In 1978 Ronald Rivest, Adi Shamir, and Leonard Adleman, then all at the Massachusetts Institute of Technology, published a paper showing for the first time (at least in unclassified literature) exactly how a public-key system might work. This paper, which formulated what has come to be known as the RSA system, and the Diffie and Hellman paper before it are routinely referred to as being the "most innovative" or "most profound" steps in modern cryptography, because they eliminated the vulnerability that had afflicted all previous codes.

The basic mathematical idea behind public-key systems is that of the "one-way function." One-way functions are those that are much easier to perform in one direction than the other. Most people can quickly figure out, with a calculator or just a pencil, what 7⁶, or 7 x 7 x 7 x 7 x 7 x 7, equals. Very few people can quickly figure out what the sixth root of 117,649 is. (It's 7.)
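The asymmetry is easy to see in a few lines of code. Here is a toy Python illustration (the `sixth_root` search is my own naive sketch, not anything a cryptographer would use): the forward direction is a single operation, while the reverse direction is a blind search.

```python
# Forward direction: raising 7 to the sixth power is one quick operation.
power = 7 ** 6
print(power)  # 117649

# Reverse direction: with no shortcut, recovering the base means
# trying candidates one by one until something works.
def sixth_root(n):
    base = 1
    while base ** 6 < n:
        base += 1
    return base if base ** 6 == n else None

print(sixth_root(117649))  # 7
```

For numbers this small the search is still fast; the point is only that one direction takes a single step and the other takes many.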

The same principle applies even to powerful computers, with allowances for their far greater speed. They can add, subtract, and multiply numbers much more quickly than they can calculate a number's factors. The simplest calculator can determine almost instantaneously that 987 x 1,013 = 999,831. But if a computer starts with the number 999,831, it will take time for the machine to determine, through trial and error, that the number can be evenly divided by 987 and 1,013.
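The trial-and-error process the machine goes through can be sketched directly. This is simple trial division, the most naive factoring method; note that the article's example number also contains some small factors, since 987 is itself 3 x 7 x 47, which the search turns up along the way.

```python
def factorize(n):
    """Full factorization by trial division -- the slow direction."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each factor as it is found
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(987 * 1013)         # fast: one multiplication gives 999831
print(factorize(999831))  # slow: trial division gives [3, 7, 47, 1013]
```

A product of two large primes has no small factors at all, so this kind of search must grind all the way up toward the square root of the number, which is what makes the reverse direction so slow.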

As numbers become larger, this period of time becomes significant. When numbers become very large, the slow part of a one-way function becomes so slow that completing it is virtually impossible. For example: if two prime numbers, each 100 digits long, are multiplied together, the result will be a number about 200 digits long. A computer can easily calculate that enormous product. But if it starts with the same 200-digit number and tries to find what those 100-digit factors are, the process will be much slower. (As a reminder: a prime number has only two factors, 1 and the number itself. Therefore a 200-digit number that is the product of two prime numbers will have only four factors: 1, the number itself, and the two prime numbers.) Finding the factors of such a 200-digit number on a modern top-speed supercomputer would require not milliseconds or minutes but at least several centuries. The task is, in computer terms, "computationally infeasible."

What Diffie, Hellman, and the three MIT authors proposed was that numbers hidden by computational infeasibility could serve as a kind of secret key in a variety of ways. The key is not really secret, of course, since with infinite time an outsider could calculate what the hidden numbers were. But no outsider would live long enough to see the results, so in practical terms the code is secure. In the RSA system, for example, messages can be decoded by anyone who knows the two secret 100-digit prime numbers that were multiplied together to produce each recipient's 200-digit public key. (In the RSA system, to simplify slightly, each person chooses his own secret prime numbers, multiplies them together, and reveals the product as his "public key." Anyone wanting to send a message to that person applies this public key in an encryption formula. The original prime numbers are then used in a decryption formula to produce the original message.) Only the intended recipient knows the original prime numbers; no one else can figure them out from the public key.
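The RSA arithmetic just described can be demonstrated with deliberately tiny numbers. This is a standard textbook-scale example, not a secure one: real keys use primes hundreds of digits long, and a toy modulus like this can be factored instantly.

```python
# Toy RSA. The recipient's secret primes:
p, q = 61, 53
n = p * q                    # 3233: published as part of the public key
e = 17                       # public encryption exponent

# The decryption exponent can be computed only by someone who
# knows p and q, since it depends on (p - 1) * (q - 1).
phi = (p - 1) * (q - 1)      # 3120
d = pow(e, -1, phi)          # 2753 (modular inverse; Python 3.8+)

message = 65
cipher = pow(message, e, n)  # anyone can encrypt with the public (n, e)
plain = pow(cipher, d, n)    # only the holder of d can decrypt
print(cipher, plain)         # 2790 65
```

An eavesdropper who sees `n = 3233` and the ciphertext could recover everything here by factoring 3233 into 61 x 53. With 100-digit primes, that factoring step is the computationally infeasible one.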

For the world's cryptographers, the public-key concept was exhilarating. For the world's governments, its implications were more complicated. Public-key cryptography raised for the first time the prospect that completely secure communications systems might be available to almost anyone—including parties whose conversations a government would like to overhear. Various U.S. government agencies have long been confident of their ability to break a code if they really need to. Now officials became concerned that as public-key coding programs became cheap and widely available, the government would lose its ability to hear what other governments, and in certain cases its own citizens, were saying.

Two branches of the U.S. government were most immediately concerned: the Federal Bureau of Investigation, which has relied on wiretaps for domestic law enforcement, and feared that criminals would start scrambling all transmissions; and the National Security Agency, which exists in part for the purpose of intercepting telephone and broadcast transmissions around the world. (By law the NSA, like the Central Intelligence Agency, is not supposed to intercept data within the United States. Because the NSA's operations are more secret than those of any other major government agency, there is no way to be sure that it has followed this rule.)

The NSA, which arguably controls the greatest concentration of computing power in America, was the main bureaucratic force behind the Clipper chip. It feared that sooner or later public-key coding systems would become cheap and commercially available, so that citizens and businesses would routinely use them to scramble their telephone calls and data transmissions. This in turn would make it nearly impossible for the government to decode messages it intercepted; not even the NSA's computers could break the public keys. To forestall this possibility, the NSA, with the eventual support of the FBI, threw its weight behind the Clipper proposal—for a government-approved encryption system that might prevent the spread of public-key codes. Early last year the Clinton Administration embraced this plan.

The Clipper chip, officially known as the MYK-78 device, is programmed by Mykotronx Inc., of California. Telephones and modems that meet the Clipper standard come with this chip built in. Clipper phones scramble messages not according to public-key systems but through a secret coding algorithm called Skipjack. Skipjack would work with several "keys," including unique numbers built into the chip supplied with each telephone, to produce what the government claims is virtually unbreakable code.

The classified nature of the Clipper-Skipjack coding system is one big cause of complaint in the high-tech world, which teems with people who believe that they can find the secret flaw in the code any big bureaucracy creates. But the real outrage about Clipper stems from another feature: its "key escrow" plan.

Unlike scrambled messages produced by public-key systems, which for all practical purposes are impossible for anyone except the recipient to understand, the scrambled messages produced by Clipper phones are vulnerable to anyone who somehow obtains their codes, including the ID number built into each phone. Under the Clipper plan the federal government will maintain a master list of ID numbers for all Clipper devices ever sold. Each number—which is, in effect, the key to its phone's Clipper code—will be split in two, and each half will be "escrowed" by a government agency. In the right circumstances—namely, the production of a wiretapping warrant—the two halves will be reunited and the government will be able to intercept and understand everything said on that phone. As Whitfield Diffie said of the Clipper system soon after it was announced, "The effect is very much like that of the little keyhole in the back of the combination locks used on the lockers of schoolchildren. The children open the locks with the combinations, which is supposed to keep the other children out, but the teachers can always look in the lockers by using the key."
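The splitting of a key into two escrowed halves can itself be done so that neither half alone reveals anything. One simple way, sketched below in Python, is an XOR split; I offer it only as an illustration of the idea, since the government's actual escrow procedure was not fully public.

```python
import secrets

def split_key(key: bytes):
    """Split a key into two shares; each share alone is random noise."""
    share1 = secrets.token_bytes(len(key))
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def rejoin(share1: bytes, share2: bytes) -> bytes:
    """Recombine the shares -- done only under a wiretap warrant."""
    return bytes(a ^ b for a, b in zip(share1, share2))

unit_key = secrets.token_bytes(10)   # a hypothetical 80-bit chip key
s1, s2 = split_key(unit_key)         # one share to each escrow agency
assert rejoin(s1, s2) == unit_key    # together they reveal the key
```

The design matters: because one share is pure randomness and the other is the key XORed with it, a corrupt official at either escrow agency, acting alone, learns nothing.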

That the government will have long-term possession of all secret Clipper keys is the reason the NSA and the FBI favored the proposal—and also the reason that nearly all of the high-tech community lined up against it. Privacy advocates were upset enough by the idea that the government would retain the keys. Even worse was the possibility that the bureaucrats guarding the keys could be bribed to turn them over to criminals or to foreign governments, which could then listen in on all supposedly "secure" conversations. (The arrest of Aldrich Ames, the alleged mole in the CIA, came in the middle of the Clipper debate.) Companies that make computers, modems, and telephone systems have unanimously opposed the Clipper plan. They say that as technology has evolved toward international standards, it has grown complicated and costly to make two different versions of all equipment—one with Clipper, for sales in America, and one without Clipper, for export. (Customers in, say, France or China would not be eager to buy scrambling equipment to which the U.S. government permanently held the keys.)

Nonetheless, the Clipper plan required no legislation, and early this year the Administration issued the orders putting it into effect. Clipper now exists as a standard for government contracts and purchases. Most of the secure, or encrypted, communications equipment sold in this country is sold to federal agencies (the military, the intelligence services, and law-enforcement divisions). Henceforth these agencies will not buy secure systems unless they are based on the Clipper chip.

Was this a wise decision? No. Is this the end of American civil liberties? I think not. The very trait that makes me wish the government had not adopted the Clipper standard also means that Clipper is not worth getting upset about.

This trait is the voluntary nature of the Clipper scheme. From now on, anyone who talks on a government-owned secure phone will be talking on Clipper, but no one else is obliged to use a Clipper phone. Nor is anyone forbidden to develop, sell, or use other encryption schemes. The three inventors of the RSA code, for example, have formed a company, RSA Data Security, that sells public-key coding programs for less than $500. Americans are as free to use them as they were before the jackboot of Clipper crashed down.

Both parties to the dispute had reasons not to emphasize the voluntary nature of Clipper. For Clipper's proponents, this was the obvious, gaping flaw in the entire scheme. The stated reason for a scrambling chip that permits wiretapping is that otherwise terrorists, drug dealers, and other criminals might use untappable scrambling schemes. With Clipper they still can.

The government at least acknowledged this objection, and tried to answer it in two ways. The first was to argue that most criminals would never hear about Clipper chips, or would forget about them when they were planning the big heist. "Thank God most criminals are stupid!" Jim Kallstrom, the FBI's special agent in charge for New York, told me in March. "If the smartest segment of the population ever went into crime, we would really have a problem. Will some criminals catch on to the system, and buy their encryption from, let's say, Israel? Yes. Will that be a problem? Yes. But it will be a substantially smaller problem than if we didn't do anything."

The government's other response has been to emphasize its ability to "make markets." If big companies are producing Clipper because that's all they use when doing business with the government, then in the long run Clipper encryption will crowd out rival schemes. Apart from the government itself, the biggest customers for coding systems today are banks and credit-card companies, which use encryption to safeguard the billions of financial transactions they handle each day. By establishing Clipper as a standard, the government hopes to keep encryption, especially public-key systems, from becoming so cheap and convenient that anyone can walk into a Radio Shack and buy a perfectly secure phone. This attempt to shut off the growth of a market enrages many of Clipper's foes—but its very indirectness illustrates the limits of Clipper's effects. It is guaranteed to be least effective against the most serious criminal opponents, such as state-sponsored terrorist rings that will not be limited to what they can find at Radio Shack.

Yet Clipper's opponents have also downplayed the voluntary factor, because it makes adoption of the system seem less nightmarish. When pressed on this point, they have usually reverted to the "slippery slope" argument: today Clipper is voluntary, tomorrow it will be mandatory—and all other coding schemes will be outlawed. There is no peacetime precedent to suggest that Congress would pass, or the courts uphold, such a mandatory measure, with sweeping controls on forms of speech. The likelihood of abuse would be reduced if Clipper's "key escrow" plan were changed in one significant way. Under current Clipper guidelines, the two agencies designated to hold the halves of the secret keys will both be part of the executive branch. One is the Treasury Department; the other is the Commerce Department's National Institute of Standards and Technology. On familiar separation-of-powers grounds, it would be safer for one of the "escrow agents" to be part of the judicial system, insulated from direct executive control. Those who are worried about Clipper no matter who its escrow agents are can and should start using some other encryption scheme.

Calming down about Clipper might also keep the hyperbole of the battle from spilling over onto the Digital Telephony bill. Although this proposal's name sounds futuristic, the proposal itself is principally an attempt to maintain the wiretapping powers the police and the FBI have had for more than twenty-five years.

Under the Safe Streets Act of 1968 as amended in 1970, local telephone companies must cooperate with lawful wiretap requests from the FBI. Most states have passed laws based on this act, setting similar standards of their own for legal wiretaps. Law-enforcement officials emphasize the many hurdles they must surmount before a judge grants such a request. The police or the FBI must show probable cause to suspect a crime, and demonstrate that all other investigative possibilities have been exhausted. Because of these requirements, fewer than 1,000 wiretap requests are granted each year.

According to the FBI, the conversion of telephone systems to digitally switched networks and the adoption of call forwarding and the like make it increasingly difficult to carry out lawful wiretap orders. Older phone systems had access points that police could use to put a tap on a single line; this was often as simple as attaching two alligator clips to a telephone wire. Newer systems lack simple mechanical access points and will become increasingly difficult to tap as time goes on. Therefore the FBI contends that for purely technical reasons, rather than for legal or political cause, the police are losing the ability to tap phones. The Digital Telephony bill would require telephone systems to build in the means for tapping on whatever new equipment they introduce.

Like any other on-line habitué, I am unenthusiastic about the whole idea of wiretaps. I realize that FBI Director Louis Freeh has himself sounded hyperbolic in making the case for the bill ("Without a new statute, law enforcement at the federal, state, and local level will be crippled," and so on). Yet I also recognize that no one could make a political or legal case that the wiretap powers granted since 1968 have been so grossly abused that they must now be revoked. These powers have, after all, been in place through the worst of the Vietnam War years, Richard Nixon's rise and fall from power, much of the Cold War, and the heyday of Oliver North, without leaving any powerful evidence that they have brought us noticeably nearer a national police state. (The opponents of the Digital Telephony bill also say that it would make too easy the collection of information about individual calling patterns, paving the way for Big Brother-style surveillance. Yet the bill would not at all relax the legal controls on how such information is collected or used.)

If the current ability of the police to tap phones is wrong, then this power should be withdrawn the normal way, by changing laws—not through the back door, as an unintended consequence of a technical change. So far the evidence suggests that the police have on the whole used the power responsibly. Therefore they deserve to retain it, under similar legal controls, through the Digital Telephony bill.