Reckoning With the Legacy of the Nuclear Reactor, 75 Years Later
Journalists have always struggled to reconcile the destruction and the development ushered in by this famous experiment.
At the time, news of the breakthrough on December 2, 1942, was conveyed only in code: “The Italian navigator has landed in the New World.”
Our “Italian navigator” was Enrico Fermi, the physicist who had escaped fascist Italy for America. The “New World” was not a place but a time: the atomic age. On that day 75 years ago, Fermi’s team set off the first controlled and sustained nuclear chain reaction.
It all happened under the bleachers of the University of Chicago’s Stagg Field. Fermi’s nuclear reactor was a pile of graphite, henceforth known as Chicago Pile-1. It produced all of a half-watt of power. But it proved that a neutron emitted by a splitting uranium atom could indeed split another uranium atom, which could split another and another, releasing energy with each reaction. With enough atoms, the chain reaction could unleash inconceivable amounts of energy. It proved, in other words, that an atomic bomb could exist.
The rest of the story is well-known: Bombs were made. Bombs were dropped. Hundreds of thousands of people died. A war was won.
As all of this receded into history, the anniversary of Fermi’s experiment became a time to reflect on the legacy of nuclear science. “It’s always been a complicated story,” says Rachel Bronson, president of the Bulletin of the Atomic Scientists, the journal founded by former Manhattan Project scientists concerned about atomic weapons. Over the past 75 years, as the specter of nuclear annihilation has grown and waned and grown again, newspapers reporting on the anniversary have tried to grapple with that legacy.
The first time an anniversary of Chicago Pile-1 was commemorated publicly appears to be its fourth in 1946, and that was by proclamation of the War Department. In an October press release, Lieutenant General L.R. Groves, the commanding general of the Manhattan Project, suggested December 2 as the “birthday” of atomic energy.
The War Department helpfully released a packet of materials for journalists who were not present at the once secretive Chicago Pile experiment. Two public-information officers interviewed more than a dozen of the 50 scientists, and many of the small but colorful details that would be retold in later commemorations originated in their report.
Details like the bottle of Chianti wine, brought in secretly by the Hungarian-born theoretical physicist Eugene P. Wigner. When the experiment succeeded, Wigner opened the bottle. The participants drank out of paper cups and signed their names on the bottle’s straw wrapper.
And details like the graphite dust that blanketed everyone. (Graphite was used as a “moderator,” to slow down neutrons so they could split uranium atoms.) Albert Wattenberg, one of the young physicists who helped build the pile, told his interviewers:
“We found out how coal miners feel. After eight hours of machining graphite, we looked as if we were made up for a minstrel. One shower would remove only the surface graphite dust. About a half-hour after the first shower the dust in the pores of your skin would start oozing. Walking around the room where we cut graphite was like walking on a dance floor. Graphite is a dry lubricant, you know, and the cement floor covered with graphite dust was slippery.”
The Chicago Pile was a genuine scientific breakthrough, but other, more famous milestones like the Trinity test and the Hiroshima bombing have also been pegged as the beginning of the atomic age. Perhaps the War Department chose December 2, 1942, as the birthday of “atomic energy”—note: not “atomic bomb,” a phrase that never appears in the press release—because it represented a purer scientific achievement. Nuclear science had not yet been used for destruction; it could just as well power our homes and save lives through medicine.
When The New York Times covered the fourth anniversary in December, science writer William L. Laurence hinted only vaguely at “incalculable potentialities for good and for evil.” Laurence is credited with coining the term “atomic age,” and he is a controversial figure in journalism. During the war, he worked for the Manhattan Project as its historian. Then he returned to the Times to continue reporting on the very project for which he had worked, even winning a Pulitzer for his dispatches from Nagasaki. In 2004, journalists argued his Pulitzer Prize should be revoked because of his “uncritical parroting of propaganda.” He dismissed, for example, Japanese reports that people were dying from radiation days after the bombings.
“We will probably never know the true extent to which William Laurence was co-opted, compromised, or corrupted by his military and governmental connections and involvements. It appears that in many ways, he was never really certain himself,” Mark Wolverton recently wrote in Undark. But from the very beginning, the story of the birth of the atomic age was being written by the very people who ushered it in.
In 1952—now the 10th anniversary of the experiment—the Kentucky New Era quoted Arthur Compton, the physicist who oversaw Fermi’s work, speaking at a luncheon of the Chicago Association of Commerce and Industry. (Compton was the one who spoke the words: “The Italian navigator has landed in the New World.”) Compton defended the use of the bomb, but he was more eager to stress the civilian impacts of the experiment, emphasizing energy as the War Department’s press release did:
As a scientific tool, the importance of the nuclear reactor is comparable with that of the cyclotron. As a means of improving health, it may reasonably be compared with the betatron, a new type of supervoltage instrument for producing X-rays and beta rays ... As a means of defense, I would rate the atomic weapons as comparable in importance with the airplane. But the great significance of nuclear energy seems to be as a source of useful power.
When the 25th anniversary came around in 1967, World War II was receding from memory and the Cold War had come startlingly close to turning hot. It was atomic weapons that Americans were thinking about again. Volney Wilson, another physicist who worked on the Chicago Pile, speaking to the Schenectady Gazette, was far less optimistic: “It’s been a big disappointment to me ... I would have thought that the development of this horrible weapon would have been more of a force to bring the world together.” Wilson was a pacifist who was always ambivalent about building a bomb, but his words now had a note of bitterness.
The 50th anniversary came at a more optimistic time: 1992. The Soviet Union had dissolved. The United States was the world’s only superpower. Russia was not only dismantling its warheads, it was selling their uranium to the United States as fuel for electricity. “Highly enriched uranium from former Soviet weapons once targeted on our cities will be used to light and heat those same cities as fuel in American nuclear power plants,” William S. Lee, president of Duke Power, said at a November 1992 meeting of the American Nuclear Society.
But it was not lost on journalists that this was still the atomic age. Articles written for the 50th anniversary noted that Russia and the United States still had enough nuclear weapons to kill millions, and that several other countries were pursuing their own. “Fifty years later, the legacy of the Chicago Pile remains mixed,” Earl Lane wrote in Newsday.
Which brings us to the 75th anniversary of the Chicago Pile. Nuclear power is on the decline in the United States today. Nuclear weapons are ever present in the news again. Yet nuclear science has also produced real breakthroughs in science and medicine. The legacy of the Chicago Pile is mixed, and it probably always will be—until, and such is the nature of nuclear weapons, the day it is clearly not.