The remains of the supernova explosion known as Cassiopeia A (NASA/ESA Hubble Space Telescope / AP)

The aftermath of a star’s death can rival the events of a creation myth. When a star explodes in a supernova, hurling pieces of itself into the cosmos, it seeds new stars and new worlds with the raw materials required for life. In death, stars are reborn. But like all creation tales, this one has a dark side. Supernovae can rain radiation and death onto living worlds that already exist. And they might be able to change the course of natural history.

One such change might have happened on Earth sometime between 1.5 and 3.2 million years ago. A star about nine times the mass of the sun blew up, and the night sky glowed a bright blue for weeks, during which the supernova outshone the full moon. Long after the darkness returned, lightning set off by cosmic rays would have arced from the sky to the ground, and the planet’s climate may have changed. Animals on land and in the shallow sea would have been doused with waves of radiation. Over time, the influx of particles could have sparked mutations in DNA, making small alterations that could have shifted the course of evolution.

From our vantage point on Earth, supernovae appear suddenly; their name comes from the word for “new star.” Their brilliant, visible shine fades away within a few days or weeks, but they continue firing a stupendous surge of x-rays, gamma rays and speedy, energetic particles for much longer. Only recently have astronomers brought these supernovae down to Earth, asking how they might have interfered with the planet’s climate and the evolutionary processes playing out on its surface.

Earlier this spring, astronomers used telltale evidence in seafloor sediments and moon dust to study two nearby supernovae that blew up a few hundred light-years away. One exploded between 1.5 and 3.2 million years ago, the other between 6.5 and 8.7 million years ago.

Adrian Melott, a physicist at the University of Kansas, wondered about the timing of the more recent supernova. Its date range includes a minor extinction event at the start of the Pleistocene, about 2.59 million years ago, one long thought to be caused, in part, by a cooling climate and dramatic regional changes in Africa and Central America. Melott and others had wondered whether a supernova could shower enough particles and radiation on Earth to cause mass extinctions. Thanks to the new research on supernova history, they could now look into it in earnest.

Melott ran computer simulations suggesting that even mild stellar explosions would shower Earth with radiation for hundreds of thousands of years, provided they happened relatively nearby. They would also ionize the atmosphere to a level eight times higher than normal, which would trigger an increase in cloud-to-ground lightning.

“I really expected to conclude that there wasn’t much chance of an effect, because of the distance, but it turned out to be more substantial than I expected,” Melott says.

While Melott and his coauthors were working on this paper, which appears today in The Astrophysical Journal Letters, another supernova-archaeology team was refining the most recent research on the two local supernovae from 1.5 to 3.2 million and 6.5 to 8.7 million years ago. Brian Fields, Brian Fry and John Ellis argue that those supernovae were closer than scientists thought, maybe only 150 light-years distant rather than 325. If that’s true, the radiation effect would be even stronger, Melott says.

The immediate radiation dose wouldn’t be terrible, roughly comparable to what you’d receive in a CT scan. But it wouldn’t be a one-shot deal. Instead, the radiation would rain down for hundreds of thousands of years. Melott says the particles would largely be muons, sister particles of the electron that carry more energy and so can penetrate deeper, including into the oceans. They would also have a bigger effect on large animals, such as mammoths or humans. All told, supernova radiation could triple the everyday background radiation from cosmic rays.

“It would trivially increase your chances of cancer, but if you do it to every organism on Earth, for hundreds of thousands of years, there might be something you could see,” Melott says. “If you could have good enough statistics to look for bone cancer in fossils, for instance, you might be able to do that.” Melott says he has looked into whether such a study is actually feasible.

Radiation is known to cause mutations in the DNA of living organisms, including in their sex cells, which can lead to heritable changes and possible physical differences in later generations. A tweak in DNA here, a shift in chromosomes there can add up to substantive changes over time, altering the process of evolution.

To Melott, the real surprise was the increase in lightning. A major spike in atmospheric ionization would increase cloud-to-ground lightning, which might affect the weather, or at the very least might spark more wildfires. “That is one of the things we want to investigate, whether there is any evidence of increase in wildfire in the geological record,” he says. But he notes he’s not a climatologist, and it would be up to climate scientists to study how ionizing the atmosphere would affect the climate.

Of course, it would take a great deal of evidence to tie a specific supernova to climate changes and mass extinction. That’s why the modeling work is the most interesting part of this research. If a medium-sized explosion a few hundred light-years away would almost certainly do something, what, then, of huge explosions?

“This is not a major event as far as the Earth is concerned. Such a thing should come along on average every couple of million years,” Melott says. “But it means that the really nearby ones, that come along every couple hundred million years, could be quite devastating.”
