The core damage now unfolding in as many as four Japanese nuclear reactors, after Friday's post-quake tsunami doused back-up cooling-system generators, is already becoming stock material for nuclear nihilists. Once again, nuclear's perennial opponents are out in force, hoping to use this episode to beat back nuclear power.
Greenpeace USA is asking its supporters to email the president and Congress with the suggested language "It's time to invest in clean, renewable energy. Not risky and dangerous nuclear power." This weekend Reuters reported on green parties in France, Italy, and Germany hurriedly parlaying Japan's reactors into a new ploy to push their respective governments away from nuclear. "We cannot master nature, nature rules us," Germany's Green Party parliamentary leader told the news agency.
After a partial core meltdown at Three Mile Island (TMI) in 1979, the American nuclear industry stalled out, overcome by NIMBY-ism. The country built one more nuclear reactor after TMI. While TMI injured no one, the accident wrought lasting devastation on the country's prospects for energy independence with its perfect storm of an ill-timed disaster movie and a sea of media misinformation. The University of Texas at Austin co-hosted a lecture and panel discussion last week on energy's portrayal in the movies. Michael Webber, a mechanical engineering professor at the university and an expert in energy and environmental policy, summed it up thusly, according to the event's Twitter feed: "The China Syndrome did for nuclear what JAWS did for sharks."
Thanks to TMI, the United States long ago lost much of the manufacturing infrastructure needed to efficiently build new full-scale gigawatt nuclear power plants. There are two plants under construction in Georgia, the first since this needless moratorium, and they are now so expensive that President Obama had to approve an $8 billion loan guarantee (against an estimated $14 billion construction cost) so that Southern Company could obtain financing for the huge project.
We shouldn't allow America's re-emerging nuclear industry to be swallowed up by a wave of misinformation following this devastating tsunami.
Nuclear is a key element of plans to minimize the impact of global warming. At 17 tons of carbon dioxide per gigawatt-hour, nuclear energy production actually emits less CO2 than wind (solar has nuclear beat by three tons per gigawatt-hour). For reference, coal emits over 1,000 tons and natural gas over 600 tons for the same amount of energy. Facts like these have already swayed many environmentalists into the pro-nuke camp. Al Gore has carefully teetered on the edge of full-on nuclear support for some time now.
Holdouts refer to "safe energy" rather than "green energy" alternatives, aiming to exclude nuclear energy, which towers over the green energy landscape as the most reliable source out there. "Safe energy" is a semantic trick along the lines of "pro-life," and it implies a similarly false dichotomy, because American-led nuclear design upgrades expected to power grids by 2020 are changing the terms of the debate.
Meet the iNuke: small modular reactors chock full of elegant design innovations, built cheaply, operating efficiently, and buried underground for your protection. One of the most promising and practical designs is from Corvallis, Oregon-based NuScale. They've got the safest reactor design the industry has ever seen, one that's drawing a scrum of serious shoppers as the company prepares to file for Nuclear Regulatory Commission approval next year.
"We think we're approaching a breaking point where plants are getting so complex and so large that it's reflected in cost," Dr. Jose Reyes, NuScale's chief technology officer, told the audience at MIT's Energy Conference this month. Dr. Reyes drew intense interest from his rapt audience of MIT students and energy industry players at the conference's panel on small and medium nuclear reactors. He quickly got swarmed as his panel came to an end. I had to catch up with him later.
The Department of Energy funded Dr. Reyes and his Oregon State University team in 2000 as part of the Nuclear Energy Research Initiative. "We were commissioned to come up with a design that was small, compact and could be built easily," he told me, initially in the hope that it could be used in developing nations. The team built a functional model unit at their university lab as proof of concept, and NuScale used that lab while overhauling their plans into a commercially viable option. They ended up with a product that costs one-third as much as a traditional nuclear plant.
The NuScale mini reactor carries a core-damage risk of one in 100 million years. To put that expanse into some perspective: 100 million years ago, flowers had yet to evolve and dinosaurs roamed the earth. It was the height of the Cretaceous period. Peer-reviewed science will not back me up on my next assertion, but I think I have a fair shot at spontaneously turning into a porpoise about once in 100 million years.
We really shouldn't be awed by this kind of technological development, as unreal as it must seem to Green partiers everywhere. Japan's tsunami-drenched reactors are 40 years old. In the same time, space flight has evolved from impractical government-sponsored rockets and shuttles to the dawn of Virgin Galactic. Nuclear just slimmed down too.
In this case, about 70 percent of NuScale's reactor design is similar to the most recent iterations of the water-cooled enriched-uranium reactors that are common worldwide and share a lineage with those in Japan. But a lot of mutation can occur in a 30 percent DNA swap, and in NuScale's case, all of it points to safer operating conditions.
For one thing, rather than one huge gigawatt reactor of the kind capped by a massive concrete dome, a NuScale plant has up to 12 individual reactors made up of self-contained modules, each immersed in water and encapsulated in steel. The 65-foot-long by 14-foot-wide modules will be manufactured under controlled conditions at a central factory and shipped to sites, dramatically cutting costs. Four modules can power the city of Madison, Wisconsin. All twelve can light up the entire metropolitan area of Memphis, Tennessee.
Because a NuScale plant is broken up into self-contained mini reactors of 45-megawatts each, the failure scenarios only pose so much risk, says Dr. Mohammad Modarres, professor of mechanical and nuclear engineering at the University of Maryland and an international leader in the science of probabilistic risk assessment.
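The capacity claims above follow from simple arithmetic; here is a minimal sketch using the article's figures (45 megawatts per module, with the city comparisons taken on faith from the article):

```python
# Each NuScale module is rated at 45 MW, per the article; plant output
# scales linearly with the number of modules installed.
MODULE_MW = 45

four_modules = 4 * MODULE_MW     # 180 MW -- enough for Madison, per the article
twelve_modules = 12 * MODULE_MW  # 540 MW -- the full twelve-module plant

print(f"4 modules:  {four_modules} MW")
print(f"12 modules: {twelve_modules} MW")
```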
Probabilistic Risk Assessments (PRAs) are the most rigorous step in any engineering design, taking months to produce and drawing on numerous computer models built from data sources like materials analyses. Engineers begin by brainstorming hundreds of possible system failures, searching for any way that radiation could find its way into the environment; they then determine a frequency for every conceivable scenario and, finally, the possible consequences of each outcome. Three Mile Island had just such a PRA, which was highly prescient, laying out a scenario of system and human error virtually identical to what actually happened there, but the plant's owners and federal regulators didn't heed its warnings at the time. PRAs earned new status in TMI's aftermath.
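The bookkeeping behind a PRA can be sketched in a few lines. In this toy version, every accident sequence is a chain of failures that must all occur, so its frequency is the product of an initiating-event frequency and the conditional probabilities of each subsequent failure; the plant-wide core-damage frequency is the sum over all sequences. The sequence names and every number here are invented for illustration, not NuScale's or anyone's actual figures:

```python
# Hypothetical accident sequences:
# (initiating-event frequency per year, [conditional failure probabilities])
sequences = {
    "loss of offsite power, then backup generators fail": (1e-2, [1e-3, 1e-2]),
    "steam generator tube rupture, then relief valve sticks": (1e-3, [1e-4]),
}

def sequence_frequency(init_freq, cond_probs):
    """Frequency of a full sequence: every link in the chain must fail."""
    freq = init_freq
    for p in cond_probs:
        freq *= p
    return freq

# Overall core-damage frequency: sum the (rare) sequences.
core_damage_frequency = sum(
    sequence_frequency(f, probs) for f, probs in sequences.values()
)
print(f"core-damage frequency: {core_damage_frequency:.2e} per year")
```

A real PRA replaces these two toy entries with hundreds of fault trees and attaches a consequence model to each outcome, but the multiply-then-sum structure is the same.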
The Nuclear Regulatory Commission requires PRAs for new designs and established plants alike, using methods standardized by the mechanical engineering profession. Much as the Food and Drug Administration mandates a series of clinical trials to assess the efficacy and safety of a prospective drug, the NRC's layers of regulators and outside consultants examine PRA data. Dr. Modarres performed NuScale's PRA just as he has done for large scale reactors and presented his findings at last year's International Probabilistic Safety Assessment & Management Conference.
Dr. Modarres says NuScale is the safest reactor he's ever come across, with 10,000 times less risk of any level of core damage than currently operating standard reactors, and 10 times less than the Westinghouse AP1000 plants China is building now (pending NRC approval, the AP1000 is also the design that will go up in Georgia).
So what if a natural disaster cut off external power to NuScale's water pumps that cool the reactor core -- the exact scenario now playing out in Japan?
It can't happen. There are no pumps.
In the NuScale plant, "You don't need a pump -- the heat creates a current of water by natural physics. Everything works by natural phenomena." Instead, the plant's steam generator tube is its weakest link, contributing the largest fraction of its one-in-100-million-year risk. The physical properties of the tube, which carries hot steam to the electricity-generating turbine, come into play in the analysis: the tube could rupture, sending radioactive steam into the turbine and depriving the reactor core of water. In NuScale's PRA, Dr. Modarres created technical models of the wear and tear the pipe and its safety valves could accumulate over the years. "We calculate through the 'physics of failure' an estimated frequency for this event," he says. The model gets more complex still, as the reactor automatically replenishes its water (or human operators can do so) through chemical volume control systems, and the PRA models have to take into account the likelihood of this system failing too.
NuScale takes advantage of its design simplicity during its pre-NRC-submission phase to swap out elements and see how that affects its PRA (that pipe may get further tweaking before the NRC sees it). Such on-the-fly revisions aren't even possible with most larger, more complex plants.
The reactor's size is one of its most important safety features. At 1/26th the size of standard reactors, there's simply less radioactivity to let loose. Dr. Reyes told the MIT crowd, "We've not only reduced the frequency of possible accidents, we reduced the consequences of accidents. That's huge psychologically." People who are afraid of flying don't really care about the frequency of accidents, he says; it's the huge consequences of a single accident. "We've eliminated that."
Dr. Modarres explains the worst-case disaster scenario this way: "Even if one of the reactors fails and releases its radiation into the containment vessel, and again if that containment vessel fails and releases its material, it would be releasing that material into water, which is one of the best scrubbers for radioactive material." Ultimately a small amount of radioactive gases could make their way into the atmosphere, but mind you the whole operation is in an underground silo.
Since each module has its own separate metal containment vessel (unlike the traditional design of concrete domes that are getting repeatedly blasted in the hydrogen-fueled explosions at the Fukushima Daiichi plant), the probability of two reactors both leaking radiation is the probability of two independent extremely unlikely events. Even more reactors? Infinitesimal.
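The "infinitesimal" claim is just the multiplication rule for independent events: when failures can't propagate between containment vessels, the chance that several modules leak at once is the product of their individual chances. A toy calculation, using a purely hypothetical per-module annual leak probability (not a figure from NuScale's PRA):

```python
# Hypothetical annual probability that any one module leaks radiation.
p_single = 1e-8

# Under independence, joint probabilities multiply:
p_two = p_single ** 2    # two specific modules both leaking in one year
p_three = p_single ** 3  # three modules -- vanishingly small

print(f"one module:    {p_single:.0e}")
print(f"two modules:   {p_two:.0e}")
print(f"three modules: {p_three:.0e}")
```

The caveat, which PRAs take seriously, is independence itself: a common cause that hits every module at once (the Fukushima tsunami is the textbook case) would break this multiplication, which is why the separate steel vessels matter.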
What about a major earthquake? "Being inside water, the forces that would be applied to the modules is much less. It's floating inside the water, so they wouldn't have as much force as if they were tied into the ground." Detailed seismic assessments are still being built into the model, and depend on location, but Dr. Modarres does not expect such events will increase major risk significantly.
Representative Edward Markey (D-MA) sounded the alarm this weekend, sending a letter to President Obama yesterday requesting a moratorium on new nuclear power plants, reminiscent of the European Greens. In light of the dire challenge posed by global warming, the growing destabilization in OPEC countries, and the ingenuity of America's nuclear engineers, the more appropriate federal response to the vulnerabilities in the old nuclear plant designs we share with Japan is to fully back nuclear energy 2.0 companies like NuScale.
“Somewhere at Google there is a database containing 25 million books and nobody is allowed to read them.”
You were going to get one-click access to the full text of nearly every book that’s ever been published. Books still in print you’d have to pay for, but everything else—a collection slated to grow larger than the holdings at the Library of Congress, Harvard, the University of Michigan, at any of the great national libraries of Europe—would have been available for free at terminals that were going to be placed in every local library that wanted one.
At the terminal you were going to be able to search tens of millions of books and read every page of any book you found. You’d be able to highlight passages and make annotations and share them; for the first time, you’d be able to pinpoint an idea somewhere inside the vastness of the printed record, and send somebody straight to it with a link. Books would become as instantly available, searchable, copy-pasteable—as alive in the digital world—as web pages.
It’s a shame that the standard way of learning how to cook is by following recipes. To be sure, they are a wonderfully effective way to approximate a dish as it appeared in a test kitchen, at a star chef’s restaurant, or on TV. And they can be an excellent inspiration for even the least ambitious home cooks to liven up a weeknight dinner. But recipes, for all their precision and completeness, are poor teachers. They tell you what to do, but they rarely tell you why to do it.
This means that for most novice cooks, kitchen wisdom—a unified understanding of how cooking works, as distinct from the notes grandma lovingly scrawled on index-card recipes passed down through the generations—comes piecemeal. Take, for instance, the basic skill of thickening a sauce. Maybe one recipe for marinara advises reserving some of the starchy pasta water, for adding later in case the sauce is looking a little thin. Another might recommend rescuing a too-watery sauce with some flour, and still another might suggest a handful of parmesan. Any one of these recipes offers a fix under specific conditions, but after cooking through enough of them, those isolated recommendations can congeal into a realization: There are many clever ways to thicken a sauce, and picking an appropriate one depends on whether there’s some leeway for the flavor to change and how much time there is until dinner needs to be on the table.
A lab has successfully gestated premature lambs in artificial wombs. Are humans next?
When babies are born at 24 weeks’ gestation, “it is very clear they are not ready to be here,” says Emily Partridge, a research fellow at the Children’s Hospital of Philadelphia.
Doctors dress the hand-sized beings in miniature diapers and cradle them in plastic incubators, where they are fed through tubes. In many cases, IV lines deliver sedatives to help them cope with the ventilators strapped to their faces.
Each year, about 30,000 American babies are born this early—considered “critically preterm,” or younger than 26 weeks. Before 24 weeks, only about half survive, and those who live are likely to endure long-term medical complications. “Among those that survive, the challenges are things we all take for granted, like walking, talking, seeing, hearing,” says Kevin Dysart, a neonatologist at the Children’s Hospital.
They’re stuck between corporations trying to extract maximum profits from each flight and passengers who can broadcast their frustration on social media.
Two weeks ago, a man was violently dragged off a United Airlines flight after being told it was overbooked. And late last week, American Airlines suspended a flight attendant after a fight nearly broke out between a passenger and the crew, over a stroller. What did the two incidents have in common? Both stories went viral after passengers’ videos showcased the rotten conditions of flying in coach today. But also, in both cases, it’s not particularly clear that the airline employees caught on camera had many better options.
On the infamous United flight, employees, following protocol, had to call security agents to remove a passenger in Chicago, due to a last-minute need to transport crew to fly out of Louisville the following day. United’s contract of carriage gives employees broad latitude to deny boarding to passengers. On the other hand, it is terrible to force a sitting passenger to get up and de-board a plane. So, the attendants were stuck: Either four people already seated had to leave the plane, or a flight scheduled the next day would have been grounded due to the lack of crew—which would have punished even more paying customers.
Will you pay more for those shoes before 7 p.m.? Would the price tag be different if you lived in the suburbs? Standard prices and simple discounts are giving way to far more exotic strategies, designed to extract every last dollar from the consumer.
As Christmas approached in 2015, the price of pumpkin-pie spice went wild. It didn’t soar, as an economics textbook might suggest. Nor did it crash. It just started vibrating between two quantum states. Amazon’s price for a one-ounce jar was either $4.49 or $8.99, depending on when you looked. Nearly a year later, as Thanksgiving 2016 approached, the price again began whipsawing between two different points, this time $3.36 and $4.69.
We live in the age of the variable airfare, the surge-priced ride, the pay-what-you-want Radiohead album, and other novel price developments. But what was this? Some weird computer glitch? More like a deliberate glitch, it seems. “It’s most likely a strategy to get more data and test the right price,” Guru Hariharan explained, after I had sketched the pattern on a whiteboard.
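The strategy Hariharan describes can be sketched as a two-price experiment: alternate between the candidate prices, record purchases at each, and keep whichever earns more revenue per visitor. This is a hedged illustration with invented demand figures, not a reconstruction of Amazon's actual system:

```python
# Simulated two-price test, echoing the $4.49 / $8.99 oscillation.
# The conversion rates below are made up for illustration.
import random

random.seed(0)

prices = {4.49: {"views": 0, "sales": 0}, 8.99: {"views": 0, "sales": 0}}
true_buy_prob = {4.49: 0.30, 8.99: 0.10}  # hypothetical shopper demand

for visit in range(10_000):
    price = 4.49 if visit % 2 == 0 else 8.99  # alternate, as observed
    stats = prices[price]
    stats["views"] += 1
    if random.random() < true_buy_prob[price]:
        stats["sales"] += 1

# Expected revenue per visitor at each price point.
revenue_per_view = {p: p * s["sales"] / s["views"] for p, s in prices.items()}
best = max(revenue_per_view, key=revenue_per_view.get)
print(f"winning price: ${best}")
```

With these invented numbers the lower price wins (more than twice the conversion at half the price), but the point of running the experiment is that the seller doesn't know the demand curve in advance; the oscillation is how they measure it.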
The Justice Department said it would withhold jurisdictions’ federal funding if they don’t start playing ball with immigration authorities. In his ruling, Judge William Orrick said those threats were empty.
A federal district court in California on Tuesday blocked the Trump administration from enforcing part of a January executive order to defund “sanctuary cities,” ruling that the directive likely exceeded federal law and unfairly targeted those jurisdictions.
“Federal funding that bears no meaningful relationship to immigration enforcement cannot be threatened merely because a jurisdiction chooses an immigration-enforcement strategy of which the president disapproves,” federal judge William Orrick wrote.
The preliminary injunction blocks the federal government from enforcing Section 9(a) of the executive order nationwide while legal proceedings continue. That section authorized the attorney general to “take appropriate enforcement action” against “sanctuary jurisdictions” that “willfully refuse to comply” with Section 1373, a provision in federal immigration law that bars local jurisdictions from refusing to provide immigration-status information to federal agents.
Film, television, and literature all tell them better. So why are games still obsessed with narrative?
A longstanding dream: Video games will evolve into interactive stories, like the ones that play out fictionally on the Star Trek Holodeck. In this hypothetical future, players could interact with computerized characters as round as those in novels or films, making choices that would influence an ever-evolving plot. It would be like living in a novel, where the player’s actions would have as much of an influence on the story as they might in the real world.
It’s an almost impossible bar to reach, for cultural reasons as much as technical ones. One shortcut is an approach called environmental storytelling. Environmental stories invite players to discover and reconstruct a fixed story from the environment itself. Think of it as the novel wresting the real-time, first-person, 3-D graphics engine from the hands of the shooter game. In Disneyland’s Peter Pan’s Flight, for example, dioramas summarize the plot and setting of the film. In the 2007 game BioShock, recorded messages in an elaborate, Art Deco environment provide context for a story of a utopia’s fall. And in What Remains of Edith Finch, a new game about a girl piecing together a family curse, narration is accomplished through artifacts discovered in an old house.
The Hulu show has created a world that’s visually and psychologically unlike anything in film or television.
Call it luck, call it fate, call it the world’s most ridiculous viral marketing campaign, but the first television adaptation of The Handmaid’s Tale is debuting on Wednesday to audiences who are hyper-ready for it. The 1985 speculative fiction work by Margaret Atwood has featured on library waitlists and Amazon’s top 20 for months now—partly in anticipation of the new Hulu show, and partly in response to the strange new landscape that emerged after November 9, wherein women in the millions felt compelled to take to the streets to assert their attachment to reproductive freedom. (When the release date for The Handmaid’s Tale was announced in December, people joked that it would likely be a documentary by the time it arrived on TV screens.)
… and other evolutionary questions for an anthropologist who studies ancient teeth.
Given all the fuss modern humans are told to put into our teeth—brush, floss, drink fluoridated water, go to the dentist to get tartar scraped off twice a year—I’ve wondered how our ancestors made do. What did their teeth look like?
Peter S. Ungar’s new book, Evolution’s Bite: A Story of Teeth, Diet, and Human Origins, is a deep dive into how the teeth of our ancestors have changed over time. Ungar is an anthropologist who specializes in teeth. With patience and the right expertise, ancient molars can help reveal the diets of our ancestors. “Teeth,” Ungar writes, “are ready made fossils.”
The book also doubles as a recounting of his career, which has run the gamut from watching monkeys in the Indonesian rainforest to repurposing mapping software for the topography of ancient teeth.
The early results out of a Boston nonprofit are positive.
You saw the pictures in science class—a profile view of the human brain, sectioned by function. The piece at the very front, right behind where a forehead would be if the brain were actually in someone’s head, is the prefrontal cortex. It handles problem-solving, goal-setting, and task execution. And it works with the limbic system, which is connected to it and sits closer to the center of the brain. The limbic system processes emotions and triggers emotional responses, in part because of its storage of long-term memory.
When a person lives in poverty, a growing body of research suggests the limbic system is constantly sending fear and stress messages to the prefrontal cortex, which overloads its ability to solve problems, set goals, and complete tasks in the most efficient ways.