In the rush to commercialize nuclear power, proponents may have hampered its long-term prospects by settling on an approach to atomic energy that may not have been the best.
On June 10, 1964, Lyndon B. Johnson rode through the streets of Worcester, Massachusetts, cheered by 175,000 well-wishers on his way to give a commencement speech at Holy Cross. Looking out over the football stadium's cheering masses, dressed in the traditional scholar's robe, the Texan delivered a paean to the power of science and technology to transform the lot of the world's poor for the better.
Spattered with bits of Christianity, his speech identified "ominous obstacles to man's effort to build a great world society." While paying lip service to disease, he concentrated on two other problems for which he had the same solution: poverty and "diminishing natural resources." The way forward against both these menaces was nuclear power.
"Let this be the year of science," Johnson said. "Let it be a turning point in the struggle--not of man against man, but of man against nature." There would be a technological fix for the world's problems. There could be prosperity for all through exploiting nature more intelligently, largely through "our new capability to use the power of the atom to meet human needs." He declared:
It appears that the long promised day of economical nuclear power is close at hand. In the past several months we have achieved an economic breakthrough in the use of larger-scale reactors for commercial power. And as a result of this rapid progress we are years ahead of our planned progress. This new technology, now being applied in the United States, will be available to the world.
Through the magic black box of science, nuclear energy would be transformed into American soft power throughout the world. With unlimited power, all the world could be a Monticello--open for life, liberty, and the pursuit of happiness. Wealth would not have to be redistributed because there would be enough for everyone to live an American lifestyle.
As in the original Dwight D. Eisenhower "Atoms for Peace" speech, the specter of nuclear destruction--which, like it or not, was an American invention--was redeemed by the utopian visions of a perfect power. "We now can join knowledge to faith and science to belief to realize in our time the ancient hope of a world which is a fit home for all," Johnson concluded. "The New Testament enjoins us to 'Go ye therefore and teach all nations.'"
Thus, nuclear power, long supported by the American government with subsidies, was officially enshrined as the American energy technology of the future. The reactor was a cheap, clean, necessary answer to the problem of the bomb and the opportunity of the future. Or so Johnson's story went.
It was a grand American narrative: Science! Technology! Progress! Economic growth! Unlimited everything! What's not to love? It's more than a bit like the one we are telling ourselves about green technology.
Unfortunately, the kernel on which it was built--the "economic breakthrough" of nuclear power--was more truthy than true.
The New York Times ran a story about Johnson's speech on page one under the headline, "Johnson Reports a 'Breakthrough' in Atomic Power." The paper followed up with a series of stories, as did the other major newspapers. Word of a breakthrough in the cost of nuclear power was big news because everyone had been waiting for economically feasible nuclear power for a decade. After the heavy promotion of the early nuclear power days--exemplified by Walt Disney's classic nuclear cartoon, Our Friend the Atom--nuclear power had stalled out with just a few demonstration plants in operation. The coal lobby smelled blood. In March of 1964 the coal industry assailed nuclear power, saying Congress needed to remove "the sheltering umbrella of Government subsidies."
General Electric and Westinghouse, who had helped build America's military and civilian nuclear program, were getting antsy that their knowledge would go to waste. "Our people understood this was a game of massive stakes, and that if we didn't force the utility industry to put those stations on line, we'd end up with nothing," as John Gitterick, a GE vice president, later told Fortune. It was this corporate desire to capture rents on a technology that only a few companies could provide that generated the "economic breakthrough" of Johnson's speech.
As soon as the words left Johnson's mouth, scientists at national laboratories around the country knew what he was talking about, even though he was a few months late with the announcement. When a Chicago Tribune reporter called Stephen Lawrowski, associate director of Argonne National Laboratory, the scientist told him that the president must have been talking about the guaranteed price that General Electric had offered Jersey Central Power & Light for the Oyster Creek plant. That announcement had "caused a flurry" in scientific circles because the price GE was charging--$68 million for the 515-megawatt plant--made it economically competitive with fossil fuels. [Editor's note: Oyster Creek was a boiling water reactor with the same basic design and containment vessel as the Fukushima reactor in Japan.]
Yet the scientists knew from the available evidence that nuclear power was far from economically competitive in mid-1964. However, instead of setting the Tribune reporter straight, Lawrowski simply punted, saying "The New Jersey plant is a significant milestone in nuclear power progress because it has affected thinking not only in America but also in Europe."
The price was a door-buster, a loss-leader, an advertisement for a nuclear age that had not yet arrived. The so-called "turnkey" plants, as they later became known, probably cost Westinghouse and General Electric over $1 billion combined, though the companies did not say so at the time.
Coal officials told the Wall Street Journal that GE had "priced the Oyster Creek plant at less than cost." A GE executive denied that, claiming the company would "make a slight profit unless we run into some unforeseen difficulties." British and Russian engineers also called the estimates into question--and French officials unsuccessfully tried to get details out of GE. But American news accounts, though they reported those foreign doubts, always made sure to note the bias that national competition could introduce into other countries' expert opinion. None questioned the U.S. expert corps' own Cold War sympathies.
Newspaper reporters, with the help of sources within the nuclear industries, came up with stories to explain how prices could have fallen so far, so fast. But like a trend piece about raising chickens in Manhattan, they were little more than anecdotes strung together by plausibility and the public's desire to believe. Reporters did cover doubts about the breakthrough, but those stories often ran deep inside the paper, whereas the optimistic pieces led their sections. Even the most skeptical piece, a September 1964 article by Washington Post reporter Howard Simons noting that "not all experts accept General Electric's figures," questioned those figures by only about 12 percent. In reality, nuclear power would end up costing not $104 or $1,040 per kilowatt of capacity but more than $3,750 per kilowatt by the mid-1980s.
Perhaps Lewis Strauss, then-chairman of the AEC, overstated the case when he told a crowd of science writers in 1954 that "Our children will enjoy in their homes electrical energy too cheap to meter," but his optimism was widely shared within the nuclear establishment. The country's political leaders were more than willing to believe and promote these technical promises, which offered a wonderfully convenient solution for an America battling Communist agitation across the world.
And besides, nuclear proponents said energy usage would soar, and they had nice graphs to back it up. Their vision was expansive, expensive, and rather brilliant. Technical reports came out purporting to show spectacularly high energy "needs" for Americans in the future. In 1960 the AEC, which had a mandate to promote the commercialization of nuclear power, projected that Americans would use 170 quadrillion BTUs (quads) of energy in 2000. In reality, Americans used about 99 quads that year. And we still do. Imagine adding 70 percent more power plants, cars, and buildings to our current energy infrastructure. It's nearly unthinkable.
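The gap between that projection and reality can be checked with simple arithmetic. A minimal sketch, using only the figures quoted above:

```python
# Figures from the article: the AEC's 1960 projection for the year 2000
# versus roughly what Americans actually used that year.
projected_quads = 170  # quadrillion BTUs projected for 2000
actual_quads = 99      # approximate actual U.S. use in 2000

# How much bigger the projected energy system was than the real one.
overshoot = (projected_quads - actual_quads) / actual_quads
print(f"Projection exceeded reality by about {overshoot:.0%}")
```

That ratio works out to roughly 70 percent, which is where the "70 percent more power plants, cars, and buildings" comparison comes from.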
Yet from the early 1950s until the energy crises of the 1970s, politicians accepted as gospel truth the overblown visions of America's energy needs emanating from the national laboratories and the AEC. Legislators continually delivered high levels of steady funding to nuclear research.
Of course, the political relationship ran both ways. The AEC knew what the government needed and the government knew what the AEC needed. In both cases, the answer was: Don't stop believing!
Despite the occasional call for the free market to work, the opposite happened. For example, nuclear power plant operators are indemnified by the U.S. government for catastrophic disasters (the Price-Anderson Act), thereby lowering their insurance rates. They were given preferential access to markets for borrowing money. There was plenty of informal and regulatory help to go with the R&D and commercialization boosts. In effect, the government socially engineered the cost structure of the industry so nuclear could compete with coal, which got to dump all its extra costs, such as air and water pollution, into the environment.
But even then, convincing utilities that they needed to go nuclear wasn't easy until General Electric hit on the genius idea of guaranteeing a fixed price to risk-averse utilities, effectively subsidizing the cost of the construction. And Oyster Creek was born. If they could just build a ton of plants, they could learn and scale and standardize: Costs would drop. Westinghouse matched GE's pricing, and what came to be known as the "turnkey" plants were built. In the bandwagon market that followed until 1973, utilities ordered more than two hundred nuclear reactors. Nuclear power had arrived.
But the turnkey plant prices did not reflect the actual costs of building a nuclear power plant. As the years wore on, it became increasingly clear that nuclear power was not as cheap as coal and other fossil fuels: The prestige of the nuclear authorities began to fall; nuclear whistleblowers came forward; environmental risks were reassessed, perhaps too stringently; the protest movements of the 1960s turned their attention to nuclear power and all the centralization of power it represented. It turned out that Americans were ready to extend democracy to technocratic decision making, and they did not like what they saw from the nuclear industry.
The nuclear industry operated as a closed network of thinkers and analysts, disregarding outside critiques of their methodologies and not taking the serious issues of nuclear power seriously enough. "One result of the regulators' professional identification with the owners and operators of the plants in the battles over nuclear energy was a tendency to try to control information to disadvantage the anti-nuclear side," a former AEC commissioner admitted in the early 1990s. The very agency charged with regulating the industry--the Atomic Energy Commission--was also charged with promoting it, and that's just the most obvious conflict of interest. Nearly everyone involved in assuring the public of the economics, safety, and environmental wisdom of atomic power was also involved in promoting atomic power. Not all of them had economic interests at stake, but few were disinterested observers.
The coalition of scientists, reactor builders, and utilities neglected the social aspects of their technology. A more subtle type of blindness to the effects of actual success afflicted the nuclear crew as well: Success surprised them. Though they believed in the engineering idea of scale--bigger is better, bigger is more efficient, bigger is cheaper--with unerring faith, they tended to ignore the problems that scale would bring. The complexity of local and global politics, safety, construction, waste management, and plant siting were all underestimated. And that all cost money. The cost of the plants rose for many reasons, not just those that pro- or anti-partisans like to highlight.
Learning to run the plants well also took a long time. The capacity factors of those huge nuclear plants--the share of the time the plants were actually generating electricity--were shockingly low. They hovered around 58 percent, which means that if we had visited a plant on ten random days, it would not have been running on four of them. Since then, capacity factors have improved to over 90 percent, a testament to how good a technology can become over time.
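To see what that difference means in practice, here is a rough sketch using the article's figures; the function name and the use of Oyster Creek's 515-megawatt rating as the example plant are illustrative assumptions, not from the original sources:

```python
# Capacity factor: the fraction of a plant's potential output it actually
# delivers over a period. Figures from the article: ~58% in the early
# decades, over 90% today.
def expected_annual_mwh(capacity_mw, capacity_factor, hours=8760):
    """Hypothetical helper: megawatt-hours generated in one year."""
    return capacity_mw * capacity_factor * hours

plant_mw = 515  # Oyster Creek's nameplate rating, per the article
early = expected_annual_mwh(plant_mw, 0.58)
modern = expected_annual_mwh(plant_mw, 0.90)
print(f"At 58%: {early:,.0f} MWh/yr; at 90%: {modern:,.0f} MWh/yr")
# A 58% capacity factor also means the plant sits idle ~42% of the
# time -- roughly four of any ten random days.
```

The same nameplate capacity thus yields more than half again as much electricity at a modern capacity factor, which is why the later improvement mattered so much.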
But that came too late. The energy futurology that had served the industry so well began to break down. Energy demand did not keep accelerating as the forecasters had anticipated. The projections made from 1960 through 1980 overshot actual demand by an average of 40 percent. We didn't need as much energy as we had anticipated. As a result, the vision of the nation's future that sold nuclear power never panned out.
Higher-than-expected costs, worse-than-expected operation, the meltdown at Three Mile Island, and the Chernobyl disaster all obviously hurt the industry with the public. But a less well-known event may have changed history just as much: on October 5, 1983, Cincinnati Gas & Electric announced that its Zimmer nuclear station would need another $2.8 to $3.5 billion and two to three more years of construction. Previously, the utility had claimed the reactor was 97 percent complete. "That news was the first of many disastrous nuclear crises that followed," wrote Leonard Hyman, an investment banker who worked with the utility industry. "Utilities tottered on the brink of bankruptcy, scrambling for funds to complete troubled projects, or to salvage what they could from huge investments in projects that had to be cancelled despite the billions that had been sunk in them."
Investors got the message: Nuclear power was not a good investment, and they scurried away. The First Nuclear Era, as the physicist Alvin Weinberg called it, was over. No new reactors would be built in the United States for more than twenty-five years.
[Editor's note: Though new plants wouldn't be built, the design of reactors and the structure of the industry *had* been established. This week in The Atlantic's Future of Energy Special Report, we'll reopen the nuclear reactor design debate that existed before Oyster Creek and the other turnkey plants, in light of the Fukushima nuclear disaster and progress on new reactors.]