Science That Frightens Scientists: The Great Debate Over DNA

Molecular biologists now can alter the very stuff of life—they can combine genes into wholly new substances called “recombinant DNA.” Such experiments are the most exciting in contemporary science. They are also the most awesome, and they have provoked a grave debate among scientists. Many fear that their work may create dangerous and uncontrollable forms of life.

The rules of the game have not changed for upwards of three billion years: every living creature is dealt a genetic hand; the best stay in for another round. Five years ago in California a few biochemists learned how to stack the deck. They contrived a method for mixing, at will, genes from any two organisms on the planet. Genes cause a creature to be like its relatives and unlike anything else. They say, in a universal chemical language, “Wings, not feet; brown feathers, not blue; quack, not warble”; or “orange fruit, not yellow; pungent, not bland; round, not elongated.”

Despite the whimsical suggestion of Sydney Brenner, a respected British biologist, the new methods do not yet permit duck genes to be usefully linked with orange genes. They do permit the long, spiral molecules of DNA—the chemical that carries genetic information for ducks, oranges, and all creatures in between—to be cut and recombined to a scientist’s specifications. No one has yet assembled the genes for a quacking fruit, but potentially more useful mixtures are being concocted.

With the new cutting and splicing technique, scientists can attach any genes they choose to certain carrier genes. These linked genes, called “recombinant DNA,” are then conveyed into a host cell, which virtually adopts the new gene as one of its own. The object of most current experiments is to transplant genes from animals or plants to bacteria. These tractable, microscopic cells proliferate immensely faster than rabbits. On a simple, cheap diet a single bacterium makes a few billion copies of itself overnight. Its DNA, including that of a transplanted gene, is reproduced at the same rate.

If foreign genes for insulin or blood-clotting factors could be made to work—not just be copied—inside their bacterial hosts, the “bugs” might become efficient chemical factories producing substances to meet the needs of diabetics or hemophiliacs. Although the medical bonanza has not yet materialized, bacteria containing recombinant DNA already promise to be exceedingly informative.

Every interesting plant and animal, including Homo sapiens, frustrates geneticists with its complexity. If each of the 100,000-odd human genes were individually transplanted to bacteria, they could all be replicated in large quantities and then separately studied in a relatively simple biochemical environment. In this way it would be possible to catalogue and describe the entire genetic complement of human beings. No easy task, but the goal is now attainable, at least in principle.

The analytic power of the new technology has led one biochemist to compare it with the invention of the microscope. But a microscope, unless used as a blunt instrument by a deranged medical student, can do no harm. The new genetic technology may be less innocent. Almost from its inception, scientists have worried that their genetic sleight of hand might confer new properties on the experimental bacteria. Old bugs might learn dangerous new tricks and might, if they escaped from a laboratory, demolish the intricate genetic balance that keeps all our chips in play.

The thought was scary enough to give pause to the very scientists directly engaged in gene transplantation. In 1974 a group of the United States’s most prominent molecular biologists publicly asked their colleagues to defer certain experiments and to proceed with extreme caution in performing others. Immediately labeled a “moratorium” by the press, the event was, as far as anyone knows, unprecedented in the history of science.

This dramatic gesture had a rapid and unintended public impact. Responding to the scientists’ hesitation, governmental and quasi-governmental bodies moved to ask whether the experiments ought to be conducted at all and, if so, under what conditions. Senator Edward M. Kennedy’s subcommittee on health held two hearings on the matter. The University of Michigan delayed construction of a specially designed laboratory until the regents voted on the ethical propriety of this particular scientific undertaking. Shortly thereafter, the city of Cambridge, Massachusetts, imposed a “moratorium” of its own on Harvard and MIT. Some thirty members of Congress, in a singularly ungrammatical letter, asked Olin Teague, chairman of the Technology Assessment Board, to review the research. There were skirmishes between scientists and community activists in San Diego, New Haven, and Bloomington, Indiana. The New York attorney general’s office held hearings. The United States government, through the National Institutes of Health (NIH), took steps to regulate the work, and the Environmental Protection Agency began to review the NIH’s proposals. Great Britain officially and the Soviet Union unofficially followed suit.

At issue now is not just the future of a seemingly occult branch of modern biology. The overriding question is whether citizens not specially trained to understand modern science can influence the racing technology that in turn affects them intimately and to all appearances uncontrollably. Gene transplantation may be the first innovation submitted to public judgment before the technology had been put into widespread use and before heavy investment had given it a momentum that was hard to oppose.

How did this happen? Largely by inadvertence. The scientists desperately wanted to manage potential research hazards without public intervention. Some who realized early that a conspicuous debate might escape the confines of the scientific community implored their colleagues to resolve the issues quickly and quietly, while they were still in control.

Now that several levels of government are intruding into their once exclusive preserve, many scientists who formerly questioned the research have become its staunch advocates—as if hoping that the appearance of unity would persuade the public to go away again and leave researchers to conduct their business in peace. But the façade of unanimity hides the scars of a long struggle, one that the scientists have been unable to resolve for themselves and in which the public has become the referee of last resort.

The crisis began in the summer of 1971, when a young cancer researcher on Long Island placed a worried and angry phone call to an older scientist in Palo Alto. As he dialed, Robert Pollack knew that he was making trouble. Pollack had cultivated a somewhat unusual career and sought to avoid the tunnel vision of which many in his profession are accused. He had taken time from his scientific career to teach biology to architecture students at the Pratt Institute and had served as host on a series of radio programs that discussed the potential misuses of human genetic engineering. Pollack’s atypical interests and a wit that some fellow scientists find abrasive rendered him a misfit of sorts at the Cold Spring Harbor Laboratory. Pleasantly situated near an old whaling village, the laboratory was once the capital of the American eugenics movement; it is now devoted almost exclusively to cancer research. (Pollack’s wife has coined a macabre, but apt, nickname, “Camp Cancer,” for the laboratory.) Under the influence of its director, James Watson, codiscoverer of the structure of DNA and author of The Double Helix, the Cold Spring Harbor Laboratory in recent years has devoted itself mainly to the study of animal cancer viruses. Pollack worked with such viruses, and he was not sanguine that they were being handled safely.

Cancer viruses, also called “tumor” viruses, remain poorly understood despite the hundreds of millions of dollars that have lately been devoted to studying them. Viruses are known to cause cancer in cats, mice, hamsters, and chickens, among other creatures, but no virus that definitely causes a human cancer has yet been identified. Technical difficulties impede progress with the more likely human viruses, so animal viruses have been the main object of study.

The best understood and most widely used is simian virus 40, or SV40, so called because it was originally found in monkeys. SV40 is small, even for a virus, and relatively uncomplicated; yet its effects are puzzling. All but harmless to monkeys, it causes cancer when injected into mice and hamsters. When SV40 infects laboratory preparations of normal human cells, it makes them act cancerous. When injected into people . . .

Nobody knows what will happen, but millions of people were inadvertently injected with SV40 in what became an altogether unsettling biological experiment. The virus was an unrecognized contaminant of polio vaccines given before 1962 and of some “cold shots” given around the same time. Because viruses can take years or decades to produce disease, there is no way of knowing what effect the mass inoculation with SV40 may eventually have. By the beginning of the 1970s, only a few thousand of those exposed to the virus had been studied, mostly over a short period. So far, SV40 has not been implicated in human disease, and currently there are no plans for further systematic evaluation of the silent, man-made epidemic.

Cancer researchers have adopted the pragmatic view that animal tumor viruses are generally harmless to people—an attitude now compounded about equally of evidence and faith. Certainly many scientists have thought SV40 incapable of producing human disease. Renato Dulbecco, a tumor virologist who won the Nobel Prize, has even been quoted as saying he would drink the stuff.

By the late sixties, however, rumors began to circulate that some technicians working with tumor viruses had contracted leukemia. One story even held that the same fate had befallen the son of a prominent investigator. Although the illnesses ultimately could not be verified as originating in a laboratory, the stories served as an irritating descant over the chorus calling for a major research program on cancer viruses. However, when the Nixon Administration declared “war” on cancer, much of the funding did go to research on these viruses.

Pollack was getting edgy. His colleagues at Cold Spring Harbor were developing methods for making large, concentrated batches of SV40. In his view, such an undertaking increased the chances that people who worked with SV40 would carry more than a little of the virus away with them. Pollack was sufficiently concerned to give a special lecture on laboratory safety in a course on tumor viruses held during the summer of 1971 at Cold Spring Harbor.

His audience was not altogether receptive. In the course of discussions Janet Mertz, a talented young biochemist from Stanford, described experiments being planned in the laboratory where she worked. With a new technique for combining DNA from two different sources, her colleagues intended to put SV40 into a common bacterium, Escherichia coli (universally nicknamed “E. coli”). She did not see anything wrong with the experiments.

Pollack did. E. coli bacteria are benign passengers in the intestine of every living person. He had visions of modified E. coli, containing SV40 genes, as a kind of biological time bomb, which investigators could unwittingly pick up and transfer to people outside the laboratory. Then, sooner or later, the bacterium might pass its little package of cancer genes to a human host. Alarmed, and perhaps a bit righteous, Pollack placed his telephone call to Mertz’s senior professor, the man who was planning the experiment, Paul Berg.

Berg is anything but a mad scientist. An academic to the manner born, he has won awards for excellence in teaching, and by all accounts he deserved them. Carrying overtones of the lecture hall in his voice, Berg establishes the same rapport with colleagues that he does with students. He met Pollack’s unexpected criticism—and the much larger controversy that ensued—thoughtfully and with consummate diplomacy.

Berg had sound reasons for proposing his experiment, which was meant to open the field of gene transplantation. He knew that if he could move genes from other organisms to E. coli he could analyze them with hitherto unattainable precision. E. coli is a vehicle so thoroughly studied that biologists understand and manipulate it with affectionate familiarity, like adolescents gathered around an internal combustion engine. Rather than tackle human or animal genes right away, Berg wanted to study something simpler, SV40. The virus has only three genes, and only one of them holds the secret of its ability to cause cancer. To understand this one gene has been the goal of many highly competitive laboratories around the world.

When Berg received Pollack’s phone call he saw no compelling reason to abandon work that could lead straight to pay dirt. As Pollack later recalled, “He was absolutely dumbfounded, as far as I could see. I must have sounded like somebody coming and saying, ‘God will punish you.’” But Pollack’s worries about safety were valid, as Berg came to acknowledge. “I put him on the spot,” said Pollack, “because he’s an honest guy and didn’t have an answer.” In the wake of the phone call, Berg discussed his proposed experiment with colleagues. He found that many of them also thought it was too dangerous to perform. Berg ultimately agreed and gave up plans to put SV40 into E. coli.

Anxiety about gene transplantation later became more diffuse, but it began in the belief that certain experiments might amplify the risks of work with cancer viruses. The worst that could be imagined was a cancer plague spread by E. coli. Berg tried to determine whether such a nightmare could occur but came up against a severe information gap: no one had any real idea of the dangers posed by even the most conventional experiments with tumor viruses.

After six months, Berg returned Pollack’s telephone call and asked his help in organizing a meeting to review the available information and assess the possible risks. As Berg recalled some two years after that January 1973 conference, “I think what came out of it, frankly, was a recognition of how little we know.” The attending scientists did call for one specific step to be taken: a long-term program to establish whether people exposed to tumor viruses in the laboratory would eventually show ill effects. But after a brief feasibility study the matter was dropped.

Gene: A somewhat imprecise term for the unit of hereditary information. Scientists usually think of a gene as providing the distinctive information required to produce a single kind of protein chain.

DNA (deoxyribonucleic acid): The substance that genes are made of. DNA is the chemical record in which hereditary information is encoded. DNA itself is a long, chainlike molecule.

Recombinant DNA: DNA from two different sources spliced together to form a single chain.

Virus: Any of a variety of tiny objects that contain their own genes but are incapable of reproducing unless they enter a living cell. Viruses usually damage, kill, or derange the cells they enter and thus are responsible for many diseases. Viruses can infect animals, plants, or bacteria. Most viruses are about one thousandth the size of an average bacterium, and they cannot be seen with the light microscope.

Plasmids: Short lengths of DNA found in bacteria. Plasmids are not linked to the main DNA chain of bacteria but are reproduced in the same fashion. They are sometimes passed from one bacterium to another.

Antibodies: A special group of blood proteins. Antibodies are capable of specifically recognizing substances foreign to the body. They become attached to these substances, which are then destroyed by certain cells in the blood. Occasionally, antibodies are made against normal tissues and become responsible for chronic diseases.

Bacterium: Any of a class of one-celled organisms that are smaller than animal or plant cells and simpler in structure. There are many different kinds of bacteria: some live on or in animals and people; a few of these cause diseases. Bacteria also live on plants and in soil or water. They usually reproduce by splitting in two, but are also capable of a kind of “mating” during which genes are exchanged. Most bacteria are about one thousandth of a millimeter in diameter and are barely visible with a light microscope. “Bug” and “germ” are colloquial terms.

Berg, like some other scientists, now periodically takes blood samples from his laboratory associates and stores the serum, frozen in liquid nitrogen, “so that if twenty years from now something does pop up . . . the material would be available.” But blood samples collected in only a few laboratories and without a systematic program of observation will likely be as useless in 1997 as they are now.

Public awareness of the scientists’ worries might have resulted in political pressure to pursue the matter, but the public never found out. Not that there was a conspiracy of silence. The habit of scientists is to talk to each other in technical jargon about real concerns and to address the public paternalistically, reassuringly, and only when they can feel good about what they have to say. News media, for their part, are capricious in what they report and, applying the usual “Dog Bites Man” canon, would hardly see fit to headline a story “Scientists Confess Ignorance.” But just six months after the tumor-virus meeting, a group of DNA researchers chose to blow their cover.

“I find Gordon Conferences very difficult times to think,” said Maxine Singer, an authority on DNA and a career scientist at the NIH. “Basically one is trying to pour an awful lot of information into one’s head.” Many of these conferences are held annually in rural New England schools abandoned for the summer. They are elite sessions intended to foster rapid, high-level communication among specialists in a variety of areas. Everybody is there to talk science. As Singer put it, “People don’t come to a Gordon Conference expecting that they’re going to go home having taken some difficult public step.”

The DNA experts at a June 1973 conference, of which Singer was co-chairman, ended up doing precisely that. They heard that gene transplantation was now a reality and almost against their will found themselves having to decide what to do about it.

Although Berg had deferred his controversial experiment, work on the technique still moved ahead. Berg had used a laborious and somewhat awkward chemical method to link genes and carriers. In another Stanford laboratory, Stanley Cohen and his co-workers searched among their collection of plasmids for a carrier that would simplify the task. Plasmids are minute segments of DNA, usually only a few genes in length, that are carried by bacteria and sometimes transferred from one to another. Cohen’s group identified one plasmid that could be easily hitched to other genes by means of enzymes. They quickly established that DNA from yeasts or toads could be spliced onto the magic plasmid and then inserted into E. coli. As the bacteria multiplied, the transplanted genes were faithfully reproduced. Their method would work with genetic material from any organism, even human genes, and was simple.

A few of the younger scientists at the conference were the first to approach Singer, who is in her early forties, with their concerns about the new methods. Their worries were similar to Pollack’s, but broader: cancer viruses still played a role in many proposed experiments, although the original plan to put SV40 into a bacterium remained under embargo. More established investigators felt the meeting should not be subverted by speculations about research hazards, but Singer, backed by her co-chairman, raised the safety issue before the entire conference.

A special session was improvised on the last day. Fifteen minutes were all that could be set aside. Fifty participants had already left; ninety remained, and they agreed almost unanimously to send a letter asking the National Academy of Sciences to explore the implications of research with recombinant DNA.

Had it ended there, the question might have remained internal to the scientific community. But a second, and in retrospect extraordinary, vote was taken. By a much smaller margin, only six votes, the group decided to make the letter public. It was sent to Science, a respected and popular weekly for professional scientists and a journal read cover to cover by reporters for all the major news media. Upon receiving the letter Philip Abelson, the editor of Science, called Singer. He asked, point-blank and without preamble: “You really want to do this?” Abelson, Singer recalls, felt “that it was not wise to publicize it, that it would bring down all kinds of problems.” But she told him the conference had voted to have the letter printed, and it was.

Although its publication was hardly a media event, the letter brought about an irreversible shift in the arena of discussion. The public would henceforth have access to the scientists’ debates, and scientists would find it impossible to neglect or suppress the question of safety—however much they might eventually want to.

The problem of gene transplantation now fell to the National Academy of Sciences, which passed it back to Paul Berg. He, in turn, scheduled a meeting with seven colleagues for April 1974. As members of the Gordon Conference had feared, the new techniques were already eliciting many enthusiastic proposals for new experiments. Scientists who originally had thought the methods remote from their interests were discovering applications in their own work, and hazard or no hazard, laboratories around the world were tooling up to use them.

The Californians were being flooded with requests for the special plasmid used as the gene carrier. “When they called me I would ask what experiments they were doing,” Berg said later. “I was really shocked, because in many ways they were exactly the same kind of experiment that we had been forewarned about a couple of years earlier ... all of which seemed not so nice.” Berg and Cohen agreed that the plasmid should not be sent out until after the coming April 1974 meeting.

An hour after the group of eight scientists convened at MIT, all agreed that an international conference should be held as soon as possible. The conference could not, however, be scheduled before the following February, and by that time many questionable experiments might be under way. Finally, as Berg recalls, one of the group made a proposal that “came as a shock.” Berg’s colleague, an expert on the biology of bacteria, said, “If we had any guts at all, we’d just tell people not to do these experiments. Maybe what we ought to do is make some public announcement.” They did. Their letter, published in July 1974 in Science and Nature, called on biologists not to perform experiments that posed fairly obvious risks. Scientists were asked not to improve the antibiotic resistance of bacteria, not to put any kind of animal virus into them, and not to insert genes for known toxins, like the botulinum toxin, into organisms that do not normally carry them. The group also asked their colleagues to “weigh carefully” putting animal DNA into bacteria. This request was actually related to their worry, by now several years old, about tumor viruses. Some kinds of tumor virus can get their own genetic code inserted as a silent passenger into the DNA of otherwise normal animals, and it was feared that cancer virus genes could inadvertently be put into bacteria along with transplanted animal genes.

For scientists to call even a temporary halt to their own work appeared to be truly unprecedented. In the late thirties a handful of atomic scientists, led by Leo Szilard, had asked their colleagues to stop publishing results that might obviously benefit the Germans. Even that effort failed until various governments intervened to classify research on nuclear fission, and by then the war was in full swing. The biologists’ request went much further: it affected research itself. And as far as anyone knows, it stuck.

What many people have not realized is that the letter dealt only with the most obviously risky lines of investigation. A year later Berg was to remark, “I think we’re getting a great deal of credit for either emotions or perceptions that I know I didn’t have. I wasn’t thinking about the social responsibility of scientists. . . . We just felt this was a way of telling other people the way we felt about it and asking them to think about it and hold off.”

The public tended to interpret the “moratorium” in another light. If scientists were banning some research, they reasoned, then all of it must be extremely dangerous. Newspapers throughout the United States and Great Britain fed this impression. Stories about the letter ran under headlines like “A Danger in ‘Man-Made’ Bacteria,” “Hybrid Molecule Test Threat Seen,” and “Bid to Ban Test Tube Super Germ.”

In planning for their international conference, the letter writers thought they would need to include only a few selected representatives of the press, but as interest built up several uninvited reporters insisted on coming. One of them even cited the Freedom of Information Act and threatened to sue if he was barred. In February 1975, some 140 scientists convened under obvious public scrutiny at the Asilomar Conference Center in Pacific Grove, California.

The oceanside retreat on the Monterey peninsula has nothing of the laboratory about it, but at Paul Berg’s instigation Stanford biochemists have driven there to hold annual meetings among its redwoods and pine trees. Believing that the location had contributed to the success of these meetings, Berg arranged with the California state park system to reserve Asilomar for the International Conference on Recombinant DNA Molecules. As the scientists arrived, monarch butterflies spread over the peninsula in their spring flight.

The 1973 conference on biological hazards had also taken place at Asilomar. Scientists left that meeting with a tacit agreement that research on cancer viruses could proceed as before, although certain safety precautions were advisable. The participants at the second Asilomar conference could not hope for such a painless outcome. They were meeting under a presumption of guilt—even though nothing demonstrably harmful had so far been produced in any experiment. Some gesture of self-regulation would have to be made.

The much deplored prospect of government supervision was, like the butterflies, unmistakably in the wind. Senator Kennedy had recently announced plans to re-evaluate the United States’s biomedical research policy, and Senator Proxmire, a perennial critic of government spending, had been particularly hard on research budgets. As one participant later put it, the goal of many at Asilomar was to “keep the Feds out,” but they had a hard time deciding exactly how. (The Feds did become interested, and Kennedy held his first Senate hearing on recombinant DNA only two months after the conference.)

The scientists came to Asilomar like the barons to Runnymede. Endowed with charisma and political savvy, they were used to running their laboratories as personal fiefs. To be sure, they had divided grant money, like the spoils of battle, in a ritual known without a trace of humor as “peer review,” but they were not in the habit of collectively making major decisions about research policy. Now 140 of these strong-minded individuals must spend a few days in a large room to forge an agreement they feared might affect them for decades to come. The clash of armored egos was noisy.

Berg, one of five people on the organizing committee, was the guiding presence at Asilomar; he would be responsible for whatever compromise the meeting finally reached. “As the meeting was going on,” he said later, “I was worried, largely because I could not see the consensus. People, I think, were being very self-serving . . . everybody would like to draw a circle around their work and stamp it as pure and unadulterated, and it’s what you’re doing which is nasty and needs to be proscribed.”

On the last evening, reality intruded, ominously, in the form of talks given by four lawyers at a session organized by Daniel Singer, Maxine’s husband. The abstract possibility that experiments were hazardous could have some very practical consequences, the lawyers said: researchers could be sued. A lawsuit by laboratory technicians, who are protected by the Occupational Safety and Health Act, would be bad enough, but if anything should escape from a laboratory to infect the outside world, the litigation could become unthinkable. Without putting too fine a point on it, the lawyers’ presentation highlighted the need for this often diffuse and rambling conference to reach some decision. The organizing committee stayed up all night to draft a statement they hoped would be accepted without prolonged debate, arguments, or a vote. “The reason we didn’t want to allow a vote,” recalled Maxine Singer, who was on the committee, “was because we thought we would be voted down. And yet we felt very strongly that we had to have something, that we couldn’t just leave with nothing.”

Bleary-eyed, Paul Berg presented the committee’s report to the final morning session of the Asilomar conference. He was, after all, forced to allow a vote, but as the voting progressed a surprised organizing committee found that their recommendations were almost unanimously accepted.

The Asilomar consensus was no mean achievement. It required that certain experiments be done in special laboratories. Many of the participants did not have immediate access to such facilities and they would have to persuade their research institutions to build them. Work with animal viruses, for example, was relegated to “moderate-risk” containment facilities. All air leaving such a laboratory—air that could contain dangerous microorganisms—must be blown out through expensive filtering systems. The experimenter in such a laboratory must wear gloves and handle experimental materials in special safety cabinets separated from the surrounding laboratory by a curtain of circulating air.

Much more stringent were the specifications for “high-risk” laboratories. These maximum-containment facilities must be separated from the outside world by air locks; personnel must shower and change clothing before leaving the laboratory. In the United States a facility at Ames Air Force Base comes closest to being an adequate high-risk facility. It was designed and built, in response to pressure from biologists, to provide quarantine for specimens returned from the moon. (Ironically, one of the principal advocates of a quarantine for moon rocks, Joshua Lederberg of Stanford, has derided the cautious approach to gene transplantation.) Even the Ames laboratory may be a million dollars away from meeting the standards finally adopted, some time after Asilomar, for “high-risk” work with recombinant DNA.

Such provisions for “physical containment” of microbes were cumbersome enough, but Asilomar also called for “biological containment” as a fail-safe measure. To this purpose bacteria were to be developed with special mutations that would allow not even one in a million to live outside the laboratory environment. No one at Asilomar knew when these bacteria would be ready. Some suspected that it would be a long wait, and they were right. Two years after the conference, no strain of bacteria has been certified. (The likeliest candidate, patriotically named X1776 by its developers, is still being tested.)

Many proponents of gene transplantation now argue that doing the experiments with ordinary bacteria on an open laboratory bench would be perfectly safe. Physical and biological containment, they imply, are superfluous measures to put the public at ease. But scientists at Asilomar clearly thought they were dealing with something that could prove to be more than a little dangerous. The meeting even went beyond what the organizing committee recommended; all but five of the participants voted to ban outright some experiments thought to be especially hazardous. The job was not finished, but Berg’s optimistic appraisal a few months later was, “I think we have planted the seed.” The fruit, however, proved bitter.

Last September—long after Asilomar by the accelerated time scale that has characterized this debate—two prominent Harvard scientists confronted each other in a crowded Unitarian church. The first to speak was George Wald, who won the Nobel Prize for his work on the chemistry of vision. Wald had been an early, outspoken, and passionate critic of the war in Vietnam and now is something of an elder statesman among the radical scientists, of whom there are many, in the Boston area. Wald’s long hair and attire—gold pendant over his black turtleneck jersey—were reminiscent of the sixties and their struggles. Although the church-housed meeting was secular, Wald’s style was homiletic: “Our ignorance is profound.” There are many diseases that still appear from unknown sources, he observed, pointing to the recrudescence of swine flu and the mysterious Legionnaires’ disease. The new experiments could multiply these incidents. “If the worst happened under this recombinant DNA research, you’d never be able to identify the disease, still less trace it to its source,” he said; “it would be just a mystery.”

Then Wald’s opponent, Matthew Meselson, chairman of the biochemistry department at Harvard and, unlike Wald, an expert on DNA, took the pulpit. Meselson’s timeless Ivy League haberdashery did not signal his own political past. For more than a decade Meselson fought, with ultimate success, against American involvement in chemical and biological warfare. It is said of him, with little hyperbole, that he “single-handedly” closed Fort Detrick, the Army’s research and development laboratory for anthrax, encephalitis virus, and other disease organisms. Meselson also led the attack on the use of herbicides in Vietnam.

He answered Wald with “a little experiment” in the psychology of fear. In his normally soothing, avuncular voice, made even more low-key by a case of laryngitis, Meselson slowly intoned the word “unknown . . . unknown.” Then, “cigarette . . . cigarette.” “If you’re like I am,” he said, “you probably develop more skin moisture in response to the word ‘unknown.’ But that’s just a word—cigarettes really are dangerous.” Meselson then argued, in the most oratorical voice he could muster, that a vital line of research was now in danger of being halted by irrational fear of the unknown, which included the fear of mysterious diseases just expressed by “my colleague George.”

The laughter that followed Meselson’s “experiment” was at least a little tinged with embarrassment; an allegedly scientific debate between two investigators with impeccable humanitarian credentials had somehow reached the level of scare tactics and amateur theatricals. But their confrontation was only one in a series of increasingly confusing and emotional debates that had begun almost immediately after Asilomar. Although that conference had accomplished much, it had also left a great deal undone; and the unsolved problems seemed to multiply the more they were examined.

The guidelines sketched at Asilomar offered ample protection from fairly obvious risks, foreseen as early as the Gordon Conference. Strict safety precautions were established for work with tumor viruses and for research that could make any bacterial species more resistant to antibiotics. Experiments with genes for known toxins, or genes taken from bacteria and viruses dangerous to people, were to be deferred indefinitely. Such conventional hazards soon began to fade from attention. Replacing them was a new threat—a form of genetic roulette ominously christened “shotgun” experimentation.

In a shotgun experiment, all the DNA of an organism (such as a fruit fly) is chopped into segments containing at most a few genes. The different segments are then separately inserted into bacteria, where they are mass-produced for further study. Because only a few of the genes in any organism have been identified, most shotgun fragments of DNA will necessarily be unknown quantities. Some of these fragments might prove nasty if they augmented a bacterium’s ability to produce disease. In theory, some shotgun experiments could be as dangerous as experiments banned by common consent. Others, undoubtedly the vast majority, would be perfectly safe. It would be impossible to know in advance which is which.

In certain cases, animal genes can be identified and purified before they are grown in E. coli. The silkworm’s gene for the fiber it makes is one of these. Some scientists have worried about transplanting even these genes to bacteria on the grounds that they might acquire different properties in the bacterial environment. Perhaps more to the point, methods of preparing purified genes depend on materials derived from cancer viruses. It is exceedingly difficult to remove all traces of the viruses’ genes from these materials; and a careless experiment could produce the result everyone has sought to avoid: a cancer gene transplanted into a bacterium. Remote as the risks would appear to be, concern about them was surfacing while a policy was developed for work with more obviously hazardous materials.

Berg and the other organizers of Asilomar were primarily occupied with tumor viruses, which posed dangers that, if not proven, were at least easy to imagine. They left the more speculative problems of work with animal and plant genes to the people who were actually planning the experiments. Donald Brown, who headed the Asilomar panel on the use of DNA from higher organisms, had been putting DNA from toads and silkworms into E. coli; moreover, he was known to believe that such experiments were completely harmless. The conflict of interest posed by having researchers draw up the safety guidelines for their own experiments did not go unnoticed. After it was over, some charged that the Asilomar conference had been self-serving and had skirted the real issues because, as one particularly caustic observer put it, “this was probably the first time in history that the incendiaries formed their own fire brigade.”

Such criticisms were not groundless. For example, Asilomar decreed that it is safer to do experiments with DNA from cold-blooded than from warm-blooded creatures, although the rationale for this decision is speculative at best. According to Maxine Singer, “it was a scientific matter with political overtones, clearly, because there were people who had done experiments with cold-blooded animals who wanted to continue.” One of the most influential of these was David Hogness of Stanford, who had been doing shotgun experiments with DNA from fruit flies.

Conflict of interest became a real issue when a committee of the National Institutes of Health began the tedious job of translating the mandate of Asilomar into firm guidelines for all researchers receiving NIH grants. Hogness was the chairman of the subcommittee that drafted the guidelines, and other members of the NIH committee were involved in experiments similar to his. The committee codified the standards for biological and physical containment that had been sketched at Asilomar; moderate-risk and high-risk laboratories were now called “P3” and “P4” respectively, and “safe” mutated strains of E. coli—though not yet available—were classified as “EK2” or “EK3.” (The NIH rating system has overtones of the movies’ G, PG, R, and X designations. Among its many details, it specifies that children under age thirteen may not set foot in a P3 facility. At P4, the lower limit is fifteen.) It was up to Hogness’ committee to assign the rating for each type of experiment.

The first draft of their guidelines to be made public—after the committee had weakened Hogness’ original draft—fell short of the mark. It was laxer than the Asilomar recommendations, and Berg was said to be critical, although he took no public stand. Others were more vocal. Five researchers, calling themselves the Boston Area Recombinant DNA Group, began writing critiques of the guidelines and organized a petition against them. They were joined by a closely allied, Boston-centered group, Science for the People.

In mid-decade America, Science for the People, an organization of politically radical scientists, may be one of the few vital groups remaining on the Left. Its recent efforts have been focused on such matters as occupational and community health, the potential misuse of scientific inquiry for political purposes, and the incipient dangers of human genetic engineering. Because gene-transplant research seemed to engage all these concerns, it was an ideal issue for the organization to take up. Jonathan King, a prominent member of the group, was quoted in Science as saying that the function of the NIH committee was “to protect geneticists, not the public,” and that having Hogness in charge of writing the guidelines was like “having the chairman of General Motors write the specifications for safety belts.”

King in many ways embodies the contrasts to be found in members of Science for the People. A professor at MIT, he is an accomplished molecular biologist, a man whom Berg has called a “very fine scientist.” Yet he has long been vocal in attacking the scientific Establishment. King speaks with boyish animation, slightly hunched over, gesticulates freely with his gangly arms, and grins almost constantly—with disarming effect. He prefaces his scientific predictions with, “I’ll bet you a chocolate milkshake . . .”

King’s opposition to gene transplantation is based on prophecies about the behavior of both bacteria and scientists. He is adept at describing dreadful scenarios that might result from seemingly innocent experiments. Like most opponents of the research, he is not much concerned with madmen working in toolsheds, or secret—and now illegal—military research. These are theoretical hazards, hardly amenable to regulation in any conventional sense. Moreover, the Army’s past failure to develop any really useful biological weapons is reassuring about the prospect of such weapons being developed in the future.

King says he is worried—and he says it with a visceral intensity—that an animal or human gene transplanted into bacteria will not only be replicated but will go on to produce a protein, as it normally does in the body. If bacteria capable of manufacturing insulin or another protein were to infect a human host, he speculates, the host could be stimulated to produce antibodies to that substance. The antibodies would then turn against the body itself and produce chronic illness. King’s reasoning on this point is somewhat subtle, but there are, in fact, bacteria that deceive the human body into attacking its own tissues: rheumatic fever and late syphilis are thought to be examples of such a process.

Opponents of the research can devise many other scenarios, and part of their case is the very fact that horror stories are so easily imagined. If only a few of them should come to pass, we would be in deep trouble. Advocates contend that their critics’ speculations gloss over many built-in safety factors. A single gene, they point out, does not usually accomplish anything. Several genes must work together to make a functional product, and it seems unlikely that a bacterium would provide the proper environment for synthesis and release of animal or human proteins.

All the disasters are predicated on an assumption that bacteria will escape from the laboratory and enter human hosts, who would then unwittingly pass them on to their friends and relations. As Robert Pollack pointed out over five years ago, E. coli, the main experimental organism, has a particular affinity for the human gut. King argues that precautions are not stringent enough, and, even if they were, laboratory workers would violate them through simple human error and lack of real concern about the consequences. Proponents of the work say the precautions are adequate, and, anyway, the particular strain of E. coli used for these experiments cannot survive more than forty-eight hours in the human intestine; by then it self-destructs. King counters that forty-eight hours is a long time for E. coli, the equivalent of as many as a hundred generations.

Some scientists have proposed that the research be deferred until a different bacterium could be introduced for research purposes. However, advocates and critics alike seem to agree that this maneuver would only substitute another set of problems for the present ones. Bacteria that live in soil, for example, might leak into the environment even more easily than E. coli.

The arguments against transplanting genes into E. coli are, so far, altogether speculative, and the logic of the speculations is not airtight. Every imaginable risk, it seems, can be waved away by someone who thinks of a sound biological reason why it could not happen. But loopholes in the reassurances can also be found, and bona fide scientific debates about the hazards soon acquire the character of an infinite regress.

Science for the People and various individuals opposed to the research presented their arguments with some force as the NIH guidelines were being reviewed. Right or wrong, their criticisms altered the course of events. The NIH committee was reconvened in the fall of 1975, and by December it announced a set of guidelines that were even more strict than the Asilomar recommendations. (For example, Berg’s original experiment, putting SV40 into E. coli, could have been done in a moderate-risk laboratory after Asilomar, but now required a high-risk facility and so was once more under effective embargo.) Science for the People takes much of the credit for this change, and some of their detractors ruefully agree. Donald Brown recalls, “Dave Hogness called me up one day and said, ‘Our committee’s being inundated with letters from Science for the People and groups of this kind. . . . Won’t you please express yourself on the other side?’ ” According to Brown, public pressure on the committee was too influential. “They were just going back and forth like yo-yos,” he said; “terribly responsive to outside lobbying rather than being really objective and thinking about it.”

Brown had his say at a public hearing called by the director of the NIH in February 1976, almost exactly a year after Asilomar. The purpose of the hearing was to review the most recent draft of the guidelines. Science for the People and the Boston Area Recombinant DNA Group, predictably, sent representatives to argue that the guidelines were still unsafe; also predictably, Brown and Hogness opposed “the Boston crazies” and said that the guidelines were already too strict.

But the most forceful arguments came from an unexpected quarter. At this hearing, Robert Sinsheimer, chairman of the biology division at Cal Tech, took the radical position that all experiments involving an exchange of genes between bacteria and higher organisms were unpredictable and therefore unsafe.

No one would typecast Sinsheimer as the leader of a crusade. He is as diffident as King—who once worked in his laboratory—is outspoken. Yet the questions raised by Sinsheimer have had a profound effect on arguments about recombinant DNA. For years he has contemplated the potential benefits and risks of genetic engineering, specifically of methods for changing the genetic makeup of human individuals. While warning of dangers, he has advocated developing techniques to replace faulty human genes. (Current experiments with gene transplantation are a far cry from human genetic engineering, but they are a first step.) As a cautious advocate of genetic engineering, Sinsheimer told an interviewer for Science, “I thought of very careful experiments to replace gene A with gene B—it never occurred to me that anyone would do a shotgun experiment.” As his worries intensified, Sinsheimer propounded an argument against transplantation of genes into bacteria.

The thrust of evolution, he observes, has been to develop organisms made of increasingly complex cells—cells with a biochemical strategy quite different from that of more primitive organisms. As far as is known, evolution has never backtracked to cross the genetic barrier between bacteria on the one hand and animals or plants on the other. Yet experiments with gene transplantation accomplish precisely this. What, asks Sinsheimer, may be the result? The dangers of interfering with this apparent evolutionary barrier are purely speculative; but that may be less a proof that the hazards are unimportant than a demonstration of ignorance. “The point,” says Sinsheimer, “is that we will be perturbing, in a major way, an extremely intricate ecological interaction which we understand only dimly.”

The only solution seen by Sinsheimer, short of banning the research, would be to carry out all the experiments at one isolated spot in the country, much as high-energy physics is done at a few facilities scattered around the world. While such a plan would be feasible, it would certainly restrict the number of people able to do the research, and it would keep from universities and other institutions the expected influx of funds to support this work.

Sinsheimer is often dismissed by advocates of gene transplantation, always with a respectful nod to his distinguished career, as someone engaged in “handwaving speculation.” But he had driven a wedge into a flaw running through the foundations of the NIH guidelines: because no one knows what the hazards might be, no one knows how to guard against them.

Proponents of the research then began to indulge in counterarguments in the evident hope that repetition would give them the authority of fact. They modestly began by pointing out that genetic exchange between higher organisms and bacteria could have taken place, and that we might just be ignorant of it. As the debate persisted, however, gene transplantation was defended again and again by scientists who claimed, without data, that such genetic exchange was very likely to have occurred all the time. They conveniently ignored the fact—at least in public—that what they were proposing, if proven, would entail a substantial revision of evolutionary doctrine.

Sinsheimer’s concerns and the debate they engendered did not deter the NIH committee from polishing the guidelines and preparing to release them.

Most American universities raced to do the new research. Newly formed safety committees planned how to refurbish old laboratories to meet the NIH standards. It seemed that a fully certified P3 laboratory would soon become as important a status symbol in academe as an Astroturf football field.

One institution, the University of Michigan in Ann Arbor, proceeded with caution. Michigan had received a federal grant to pay most of the $306,000 needed to upgrade one of its laboratories to the P3 level. Several of the faculty had come to believe that gene transplantation, like the military research disputed a few years before, was work that a university simply should not do.

The debate over DNA research went on at Michigan for a year, as other universities looked on with some apprehension. Finally, in May, the regents prepared to hold the last public hearing and then vote.

They met in a huge, orange-brick structure—a legacy of the student demonstrations and administrative paranoia of the late 1960s. Almost an exact cube, the fortresslike administration building has hardly any windows to break and none at all in the regents’ meeting room. It was an open hearing, but one conducted on the university’s well-protected home ground.

No experimental scientists spoke against the research. Biologists were conspicuously absent— perhaps because a colleague’s grant funding is not lightly disputed. Nor were there representatives of Ann Arbor’s nonacademic community, except for a young member of the Ann Arbor Ecology Center, who expressed his concern that “a decision of such magnitude could be made without assuring the full community its proper voice.” The regents reminded him that informational forums had been held. The citizens of Ann Arbor were clearly regarded as observers, not participants, in what was, after all, a matter of university policy. Exit the opposition.

Enter the defense. The dean of the college of literature, science, and the arts. Dean of the graduate school. Dean of the medical school. Chairman of the division of biological sciences. Chairman of the department of microbiology. Their message to the regents was pointed: the new genetic technology would revolutionize biology, and the University of Michigan could start using it before just about anyone else. One professor stated openly what the others implied: “During the past fifteen to twenty years we have let slip through our fingers numerous opportunities to establish a strong cell- and molecular-biology program here. At the moment, the university has a favorable position with respect to research on recombinant DNA. It is perfectly clear that other universities will proceed with . . . research on this subject. Should Michigan choose not to, we will lose our position; those interested faculty will of course go elsewhere; our recruitment in this and related areas will falter; and we will suffer a blow to our excellence.”

A few days later, the regents announced their vote: six to one in favor of the research. No one was surprised, but not everyone was happy.

The regents’ final go-ahead was made possible largely by a faculty committee of humanists, whose report, sent to the regents two months before, had overwhelmingly endorsed the research. There was just one dissenter in the group: Shaw Livermore, a professor of American history. About fifty years old, with a gaunt face, disheveled, long, graying hair, and the intense manner of an abolitionist, Livermore looks like a person who has lost sleep over moral issues.

Livermore was incensed that he had been lumped with those who opposed the research solely on grounds of safety. He was, he said, convinced that university scientists had acted responsibly to guarantee the experiments would be done safely, but that did not reassure him. He had been impressed by Sinsheimer’s arguments and went a step further. Livermore saw the gene transplant technique as one of incredible power and wondered, simply, if we would all be able to handle it. He wrote in his dissenting statement that if the research is successful, “man will have a dramatically powerful means of changing the order of life. I know of no more elemental capability, even including manipulation of nuclear forces. ... It should not demean man to say that we may now be unable to manage successfully a capability for altering life itself.”

The night before his committee voted its recommendation, Livermore had gone out to walk in the woods behind his house. It was a long walk in which he had to decide “whether or not I could vote for this thing.” He could not.

An intuitive stand like Livermore’s is hard to defend. It is much easier to do what other opponents of the research did at Michigan: produce statistics about the number of laboratory accidents that have occurred in the past ten years and argue that that is the real problem. Livermore was virtually alone, surrounded, in his words, by the “barely heard sounds of quiet desperation.”

In Cambridge, Massachusetts, those noises were gaining volume.

Cambridge, although it harbors two world-renowned universities, is not quite a university town. Its politics are fueled by a running feud between Harvard and MIT (especially Harvard) and the city government (especially the mayor). Mayor Alfred Vellucci periodically threatens to turn Harvard Yard into a parking lot and, somewhat more facetiously, the Harvard Lampoon building into a public urinal. The university, for its part, has been developing large segments of Cambridge real estate. In Ann Arbor, city and university interests have rarely collided; in Cambridge, on the other hand, rhetorical collisions are a way of life.

So last summer, when a Boston weekly newspaper sported a front-page headline declaring “Biohazards at Harvard,” it had an immediate effect. The article’s authors, who had consulted with the Boston chapter of Science for the People (but did not mention the organization), detailed Harvard’s plans to build a P3 laboratory for experiments with gene transplantation. They described a controversy over these plans, one that had been smoldering among the university’s biologists.

Mayor Vellucci quickly decided to call a public hearing on the matter, and was already scheduling it as George Wald arrived in his office to encourage him. The brouhaha in Cambridge took place just three years after the small band of scientists in a New Hampshire school building had decided to put their reservations into the public record. Now the public was asking scientists to explain themselves. Science had been dragged into City Hall.

The old wood-paneled city council chambers, filled to the balconies, held only a fraction of the people attending the televised hearing. Graduate students from Harvard and MIT, inside the city hall for the first time, were jammed together with longtime Cambridge residents who had never set foot in Harvard Yard.

Under the bright lights sat the most pained people in the room: the scientists who had been summoned by Mayor Vellucci. Physical discomfort and sweat were the least of their problems. Suddenly their careers hinged on the ability to defend highly technical biochemical work in plain English, a language some of them had not spoken for years. They could not just defend putting DNA into E. coli at P3 and EK2. Glaring at the assembled scientists, the mayor opened the hearing by telling them to “refrain from using the alphabet. Most of us in this room, including myself, are lay people; we don’t understand your alphabet. So spell it out for us.”

The anger in the room was unmistakable. Community resentment of Harvard, a long-latent unease about science in general, and distrust of the federal government combined to clobber Harvard and MIT scientists who wanted to do ominous experiments with federal money. In the mayor’s eyes, it was Cambridge against the world, and Cambridge had a chance of winning.

Maxine Singer made a special trip from Washington to defend the NIH guidelines, which coincidentally had been released the morning of the hearing. After some badgering from the mayor about whether or not she was “on the Harvard team” (and an additional interchange in which Vellucci jokingly asked the attractive scientist for her telephone number), she testified that the guidelines “provide a high degree of confidence that hazardous agents won’t escape.” But this wasn’t enough for Vellucci; he wanted to know why the NIH, which would fund the disputed experiments, had not told him and the city council about them. “If it wasn’t for some of these newspapers that we have around Cambridge,” he railed, “we wouldn’t have known nothing about this. . . . We got nothing from you, nothing from Congress, nothing from Harvard, nothing from anybody. Is this any way to run a government? We caught Harvard just in time.”

The calm, reassuring, and discreetly condescending tone that some of the scientists adopted only made the mayor and council bristle more. Mark Ptashne, a Harvard biologist with interim responsibility for the proposed laboratory, stressed confidently that “no known dangerous organism has ever been produced” in a gene-transplant experiment. But a councillor shot back: “Just what the hell do you think you’re gonna do if you do produce one?”

By asserting that they personally believed their experiments to be safe, Ptashne and other proponents paradoxically raised their audience’s level of suspicion. Jonathan King responded to them: “If you have a committee of people charged with protecting us from a danger, and these people are willing to get up in public and say there’s absolutely no danger, there’s nothing to worry about, the facilities are supersafe, the bugs won’t make you sick—well, that is the last person that I am going to entrust with protecting my health.” He was applauded enthusiastically.

King and other members of the Boston chapter of Science for the People had gone to Ann Arbor when no scientist there would speak out against the research. They were not able to convince the University of Michigan’s regents, but they had a ready audience in the Cambridge City Hall.

The council had to do something; the political climate in Cambridge now demanded it. One councillor reported that his conversations with Cambridge residents were running “thirty to one” against the research. Dramatically, the mayor decided that the best way out would be to prohibit all recombinant DNA experiments for a very long time. In a show of belligerence he announced that he was submitting a resolution to ban the work outright for two years, over Ptashne’s plaintive objection that this would wreck the plans of all biochemists and most biologists—a very large number—within the Cambridge city limits.

When the city council reconvened two weeks later, the mayor’s pyrotechnics had been defused. The council called for a moratorium of only three months and then only on experiments requiring at least a moderate (P3) level of containment. No such research was under way or really imminent in Cambridge. The universities readily agreed to observe the brief moratorium. (In fact the moratorium has been extended.)

Then, as might be expected, the council created a committee to evaluate the situation. It empowered the city manager, James L. Sullivan, to form a nine-member review board. But Sullivan transformed a traditional bureaucratic maneuver into an epoch-marking event. He appointed exclusively nonscientists to the committee. Scientists had not been able to agree so far, he observed, and there was no reason to expect they would change now.

Meanwhile, the biochemists could relax, at least temporarily, about their careers. But as time wore on, and the outcome remained in doubt, they became more and more militant about their intention to do the new research. Molecular biologists like David Baltimore and James Watson, who originally sided with the more cautious faction, began to push very hard for the freedom to do the new research within the NIH guidelines.

Watson has a reputation for caring a great deal about safety. Some years ago he even threatened to sue a colleague who proposed bringing a virus suspected of causing human cancers into Harvard’s biological laboratories. But Watson also cares intensely about his own career and those of his protégés. He has repeatedly stressed that safety measures must not interfere with the goals of his basic research, even while he has gone on public record as doubting that such research will yield results for cancer treatment within the next hundred years.

It is all but impossible to sort out the motives of scientists on both sides of the controversy. Ambition is rather obviously a factor, as are personal animosities. For example, one of the irascible Watson’s most distinguished colleagues at Harvard privately opined that he would not mind having an Andromeda-strain facility in the biological laboratories—if the nuclear self-destruct device could be planted under Jim Watson’s desk. Since then, Watson has resigned his Harvard professorship.

The experimentation review board has met in Cambridge’s city hospital, a location far removed in spirit from the scientists’ tasteful retreats in New Hampshire and California. Twice weekly, a heterogeneous, but hardly ordinary, group assembles to hear and review testimony about the safety of experimentation with recombinant DNA. They have acquired an almost talmudic familiarity with the federal guidelines and placed themselves on a first-name basis with the entire biochemical community of Cambridge. The very existence of this board is what the scientists had been more or less consciously trying for five years to avoid. Now, a nun-nurse who administers a general hospital, the owner of a heating oil company, a well-to-do housewife, a structural engineer, two physicians, a philosopher of science, a social worker, and a nine-to-five employee of the Carter’s Ink Company have been asked to evaluate the safety of the hottest research in modern biology. The board’s views are advisory and supposedly will affect only the city of Cambridge, but nobody doubts that this consummately local group will influence decisions made throughout the world.

Some scientists testifying before them have, not very subtly, compared the proceeding to the Inquisition, the Scopes trial, and the Lysenko affair. The insult aside, this evaluation seems misplaced. The citizens’ group is not preparing to make a doctrinal judgment; they do not care what the scientists say but rather what they do, and then only if it seems to threaten the health of their townsfolk or the already degraded environment in which they live.

Humankind has come, in the last two decades, to feel itself increasingly the victim of scientific assault and battery. The litany of known and suspected hazards—DDT; cyclamates; asbestos; 2,4,5-T; PCB and PBB; vinyl chloride; DES; Freons—is oppressive and all too familiar. On balance, excessive restraint in using these gifts of modern technology has been less common and less costly than enthusiastic acceptance.

Gene transplantation was just another threat, but it pushed a lot of latent worries over the threshold. The prospective experiments did more than scare the public. They scared the scientists themselves, and first. Why? Experiments with recombinant DNA are almost certainly safer than many laboratory procedures—possibly safer than conventional work with tumor viruses and cancer-causing chemicals, which have never received intense public scrutiny.

In part people have worried that molecular biologists would become like the sorcerer’s apprentice. If an altered and consequently harmful bacterium should escape the laboratory, it might spread widely in the environment and ensconce itself all but irreversibly. The toll of such a mistake could be vastly greater than with more conventional experiments, which as a rule injure only the people directly exposed to them.

It was clear from the very beginning that gene transplantation could greatly accelerate genetic research, and that fact, although exciting, was also frightening. As Sydney Brenner observed at Asilomar, “Science has built-in pauses; some last one hundred years. But the thing about recombinant DNA engineering is that it suddenly has made many things very easy that were once very difficult.” If the rate of scientific discovery gets faster and faster, how long before even experts cannot keep up?

Whatever the dangers of gene transplantation, they can only be measured by careful, long-term study of those most at risk—the experimenters, their assistants, and ancillary personnel. Ordinary plagues are easily recognized: like canaries in a coal mine, people start to die or get conspicuously sick in short order. By contrast, we might expect that organisms carrying transplanted genes would have subtle effects, and these not measurable for years or decades. Because no one knows what specifically to look for, the general health and death rate of exposed individuals would have to be monitored over a period as long, perhaps, as twenty or thirty years.

Expensive and troublesome as such an investigation might be, it could pay off. If we are to learn anything about the risks of this research, scientists may have to live with the cost and inconvenience of careful epidemiology—and with the unpleasant findings such studies may turn up.

The DNA dispute has done more than focus attention on the possible dangers of “pure” science; it has encouraged nonscientists to ask, out loud, whether they really espouse the goals of research that does not obviously benefit them. Some advocates of the experiments, like Matthew Meselson and Mark Ptashne of Harvard, argue that science is important for its own sake, that scientific knowledge is a cultural value and should no more be subjected to utilitarian criticism than music or painting. Their view may be at least a partial recognition that defending science on utilitarian grounds is not as easy as it looks.

Nobel laureate and MIT biochemist David Baltimore, in an interview, acknowledged “a sense that all of the magnificence of modern science has not brought qualitative changes in people’s feelings about the world, has not eradicated poverty, has not made the population more intelligent, aware, comfortable. . . . In fact, most of the scientific progress now is hidden to everybody except the technological world.” Like many scientists on both sides of the controversy, Baltimore has been active against military misuse of science. He opposed war research at MIT during the Vietnam War. (His own career has been devoted to research with viruses that cause or might cause human disease.) Bearded, wearing a blue work shirt and jeans, he sat in his kitchen under a McGovern poster and next to his daughter’s green plastic alligator as he reflected, with a composed intensity, on the unexpected journey he and his colleagues have taken.

Baltimore is clearly fatigued by the problems the debate has caused. “I’m ambivalent,” he said, “about whether in the end it was a good thing to do or not. I think that we had a responsibility to do it and I don’t see how we could have avoided that responsibility. But we haven’t provided what we wanted to provide, which was a smooth transition from a kind of laissez-faire policy in science to a rational evaluation of hazards. . . . I think the situation has escalated to the point where it’s an inappropriate response.”

He is probably right. The rancor generated by the dispute hardly makes sense as a specific response to this particular line of experimentation. In calling scientists to account for their work with genetically modified bacteria, the public and its representatives seem to be seizing an opportunity to establish a more general principle. A recondite area of modern biology has become a metaphor: the ultimate question is not whether bacteria can be contained in special laboratories but whether scientists can be contained in an ordinary society. □