Radioactive elements on Earth are like geological watches. A radioactive isotope of carbon is used to date human civilizations, among other things, because we know that its half-life is almost precisely 5,730 years; measure how much of the carbon-14 has decayed and you can get a pretty accurate estimate of how old something is. (If half of the expected amount is left, you'd say, "This thing is likely 5,730 years old.")
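The arithmetic behind that estimate is simple: each half-life cuts the remaining carbon-14 in half, so the age is the half-life times the number of halvings. A minimal sketch in Python (the function name is illustrative, not from any dating software):

```python
import math

# Half-life of carbon-14, as given in the text.
HALF_LIFE_YEARS = 5730

def age_from_fraction(fraction_remaining):
    """Estimate a sample's age from the fraction of carbon-14 remaining,
    assuming the decay rate is constant -- the 'steady watch' assumption
    the article goes on to question."""
    return HALF_LIFE_YEARS * math.log2(1 / fraction_remaining)

# The article's example: half the expected carbon-14 left -> one half-life.
print(age_from_fraction(0.5))        # 5730.0 years
# A quarter left means two half-lives have passed.
print(age_from_fraction(0.25))       # 11460.0 years
```

The whole method leans on that constant in the exponent; if the decay rate wobbled with solar activity, the computed ages would drift with it.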
But what if the rate of radioactive decay -- the watch -- were not constant? One minute the second hand moves at one speed, and the next it has sped up or slowed down. And what if the thing changing that decay rate was activity on the sun, 93 million miles away?
That's what recent research at Purdue University suggests. In a slate of recent papers, physicists Ephraim Fischbach and Jere Jenkins argue that measured differences in the decay rates of radioactive isotopes cannot be explained by experimental errors. Instead, the rates seem to vary with the Earth's distance from the sun and with periodic changes in solar activity.
"We are led to suggest that nuclear decays may be intrinsically inﬂuenced by the Sun through some as-yet unexplained mechanism, possibly involving neutrinos," they wrote earlier this summer. The researchers got interested in the problem when a solar flare on December 13, 2006 (seen above) decreased the decay rate of a radioactive sample they were studying.