The mere mention of radiation is enough to make most humans squirm, but the truth is that we're surrounded by radioactivity. If you live in Colorado, for example, you may be getting three or four times the dose of natural radiation that you would if you lived in, say, Delaware. And yet no one I've ever met chooses the state in which they live based on its radiation levels, nor do Coloradans seem to be suffering.
So let's propose the difference between the yearly dose of natural radiation in the mid-Atlantic and in Colorado as a standard for comparison. That would be about 67 millirems. You will also see these numbers reported in microsieverts (or μSv, as you're likely to see it written). One millirem equals 10 microsieverts. So, when radiation levels right near Fukushima were in the hundreds of microsieverts (tens of millirems) per hour, you wouldn't want to be near there. On the other hand, when radiation levels reaching the west coast were estimated to be in the one microsievert (0.1 millirem) range, you wouldn't have much to worry about.
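The conversion is simple arithmetic, but it's easy to slip a decimal place. A minimal sketch of the unit math (the function names and the "Colorado baseline" constant are mine, for illustration):

```python
# 1 rem = 0.01 sievert, so 1 millirem = 10 microsieverts.
MICROSIEVERTS_PER_MILLIREM = 10.0

def mrem_to_usv(mrem):
    """Convert millirems to microsieverts."""
    return mrem * MICROSIEVERTS_PER_MILLIREM

def usv_to_mrem(usv):
    """Convert microsieverts to millirems."""
    return usv / MICROSIEVERTS_PER_MILLIREM

# The ~67 mrem/year Colorado-vs-mid-Atlantic gap, in microsieverts:
print(mrem_to_usv(67))   # 670.0 microsieverts per year

# One microsievert is only a tenth of a millirem:
print(usv_to_mrem(1))    # 0.1 millirem
```

Note that a reading of 100 μSv per hour works out to 10 mrem per hour, which is why the Fukushima-area numbers were alarming while the west-coast estimates were not.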
Obviously, there are more nuanced ways of measuring radiation risk; the amount of time over which you absorb a dose makes a difference, for example. But the Colorado principle strikes me as a good way to think about your relative risk from radiation exposure. Here, I talk about this concept in a new video:
Update 8:19pm: This post originally used the incorrect letter to denote micro. We regret the error.