When I am out on my bike, I am responsible for the area immediately around me, maybe 12 inches in every direction. The rest of the road is not my problem. I do not make eye contact with other bicyclists or motorists hurtling toward me, unless they are in my 12 inches. By not looking at them, I am making it their problem to not hit me, which of course they don’t. The drivers do the same thing. We are an army of high-speed somnambulists, purposefully behaving as though we are the only ones on the road.
It feels ridiculously dangerous, riding around those first few months—also, no one, me included, is wearing a helmet, although my excuse is that I haven’t been able to find a bicycle shop that sells them. But it becomes more and more evident that this is a normal, accepted level of risk here. Once, during a typhoon, I look out of a taxi window swimming with rain and see ten cyclists casually skimming through the ankle-deep runoff, impervious, as if disposable ponchos were armor.
It’s easy to feel as if safety has a universal definition. Freedom from want, freedom from fear—aren’t those what people mean when they think of safety? Perhaps, but the routes through the world to that state of being are circuitous and varied. Smoke alarms, for instance, have been required in every American bedroom since 1993. We rarely think about them, except to grouse when they go off while we’re cooking. France, however, only began requiring residential smoke alarms in 2015. Switzerland, rated the safest country in the world in 2015 by one consumer-research firm, has not mandated them at all. There is not a simple, one-way progression from a state of nature to a state of safety. Even within nations, there are fundamental divisions about how we want to deal with risk.
* * *
Deciding what dangers to avoid sounds like a supremely rational process, on the face of it. You estimate the probability of an event (house fire, bicycle crash) and the probability of the bad outcome (death) given that event, multiply the two, and get a number that tells you how likely the worst-case scenario is. Then you decide how you might defend against it. Get a smoke alarm. Wear a helmet.
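The arithmetic is simple enough to sketch in a few lines. The numbers below are invented purely for illustration, not actual fire statistics:

```python
# Expected-risk sketch: chance of an event, times chance of the
# worst outcome given that event, gives the worst-case probability.
# All figures here are made up for the sake of the example.

p_house_fire_per_year = 0.003   # hypothetical: chance of a home fire in a year
p_death_given_fire = 0.01       # hypothetical: chance a fire proves fatal

p_worst_case = p_house_fire_per_year * p_death_given_fire
print(f"Annual worst-case probability: {p_worst_case:.6f}")  # 0.000030
```

The interesting part, as the next paragraphs argue, is not the multiplication itself but everything that interferes with our plugging in honest numbers.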
The truth is, though, that at this point a number of things come between us and a rational decision. Over the last half century, researchers have uncovered systematic biases built into how we decide, the byproducts of mental shortcuts known as heuristics. These heuristics usually bring us to a good-enough solution swiftly, which may be one reason they’ve stuck around. But sometimes they generate peculiar errors.
We judge how likely something is, for instance, by how recently we’ve seen it happen. The psychologists Daniel Kahneman and Amos Tversky call this the availability heuristic. On the one hand, it can generate a patina of reassurance that blinds us to real dangers. We regularly put our lives in the hands of doctors, whose image in our minds is of benevolence and healing. However, recent research has suggested that medical error may be the third most common cause of death in the United States—in part, it seems, because while medicine is indeed capable of wonderful things, preventable human errors are not as well controlled as they are in fields like nuclear power.