Why We're So Bad at Managing Risk

The streets at Edwards Air Force Base, where most of the Air Force's and NASA's aeronautical flight testing and research takes place, are not named for generals. They're named for pilots killed on test flights. It's a reminder to all who work there that the junction between technology and nature can be a dangerous place, especially when new technology is reaching into previously unexplored areas, or into areas whose characteristics and behavior are not well understood.


The deep ocean, of course, is every bit as unexplored and little understood as the upper atmosphere. And as the oil spill in the Gulf underscores with increasing emphasis each day it continues, the dangers there are every bit as high. BP is not new to offshore operations or to the risks inherent in drilling into the earth. So how did the company misjudge the dangers, risks, and consequences of an accident so badly?

As the news about the spill keeps getting worse, there's more focus on what appear to be egregious lapses in safety and risk management at BP and the Minerals Management Service (MMS) in the lead-up to the disaster. The relationship between regulators and industry was far too cozy. Managers dismissed cautionary reports by their own engineers and scientists about the unknowns and potential risks the project would entail. Financial incentives trumped safety concerns in decisions about which safety systems were put in place.

Appalling as all those revelations are, they wouldn't shock anyone who's ever worked in an industry where technology and nature come together for a commercial venture. Engineers at rocket manufacturer Morton Thiokol cautioned about potential problems with the Space Shuttle Challenger's O-rings but were overruled by management. Concerns voiced to NASA about the temperature at launch were dismissed as well. The Shuttle launched on schedule and blew up 73 seconds later.

How do such management disasters occur? The easy answer is, there's a financial incentive for going forward, and a financial disincentive for holding back. Program managers are rewarded for meeting budget and schedule milestones and obtaining results. Safety generally works in opposition to all that. 

In addition, risk is always theoretical until an accident occurs. It's harder to argue for guarding against something that hasn't happened before, and might never happen at all. So safety sometimes gets short shrift against more tangible concerns like commercial gains and public image. NASA, for example, had a schedule to keep with the shuttles, and the vehicles had been sold to Congress and the country as reliable, reusable space transportation systems. Too many safety delays would affect not only the budget, but the perception of the program, which could endanger its future funding.

But the reason we're so bad at risk management goes far beyond that. In his 1995 book Risk, British researcher John Adams explains a whole set of factors that influence how we view and manage -- or mismanage -- risk. 

For starters, Adams quotes Frank Knight, who wrote a book called Risk, Uncertainty and Profit back in 1921 (proving that this is far from a recent problem). Knight made an important distinction between risk, which he defined as "if you don't know for sure what will happen, but you know the odds," and uncertainty, which he said was when "you don't even know the odds." And uncertainty, by definition, is unpredictable. 

In retrospect, one could make a good argument that BP didn't manage the risks in its deep water drilling effectively because its managers and engineers didn't really understand the forces they were dealing with. Which is to say...they didn't even know the odds. So they weren't managing risk. They were groping in the darkness of uncertainty, pretending they could see. 

But even when we're dealing with known risks, we do a poor job of managing them. For one thing, humans have a propensity to alter our behavior in ways that negate attempts at risk management (something Adams calls "Compensation Theory"). When seat belt laws were passed, people started driving more aggressively. Likewise, when there's a safety or back-up system in place, people are often willing to push further into riskier territory (like drilling five miles beneath the ocean). And we often mistake luck (at having nothing go wrong) for proof that an activity actually carries an acceptable level of risk -- a belief that gets stronger the longer we go without an accident. What's more, risk taking is tangibly encouraged and rewarded in American culture and business. Just look at the bonuses given out on Wall Street, and the way we idolize entrepreneurs and the risk-taking entrepreneurial spirit.

In addition, we all view environmental risk differently. Some people take a more optimistic view of life and nature. They have faith in the possibilities technology can bring and believe nature can probably absorb whatever hits humans impose on it. At the other end of the spectrum are people who view nature as very fragile and vulnerable. To this group, almost any environmental risk is unacceptable.  In between are people who see both the potential benefit of technology and exploration, and the potential risks involved, but believe that regulation can make risk "manageable." These differences are relevant because what we view as an "acceptable" risk or safety procedure depends, in large part, on which of these worldviews we hold.    

These individual filters have all the more impact on how we go about risk management, Adams says, because decisions about risk take place in situations where exact answers about danger and consequences cannot be known for sure or "proven." They are "guesses" about the future. Risks, after all, are possibilities, not certainties. If they were certainties, they wouldn't be called risks. They'd be called costs. Because there is no certain answer, our individual worldviews and other considerations (like profit, or an engineer's need to consider the interests of their employer and their professional reputation as well as the public good) have more room to influence how we view the risk in question, or how strenuously we argue for our concerns.

So what to do? Unfortunately, Adams doesn't have a nice, neat answer to the problem. But a couple of examples from aviation are instructive. The airlines have achieved an impressive safety record through stringent redundancy and repetition. Airliners have at least two pilots. At least two engines. At least two or three electrical systems. Back-up systems for the back-up systems. Airliners are designed under the assumption that anything mechanical can and will break, so they're designed to get the passengers down safely even through multiple layers of failure. And to guard against pilots using all those back-up systems to push the envelope of risk further, the airlines train their pilots to fly every moment of a flight by precise, practiced, and prescribed procedures.

NASA's Dryden Flight Research Center at Edwards AFB -- which operates in the shadow of all those sober street names -- also relies heavily on checklist procedures. But while the airlines operate conservatively inside a well-explored and known environment, flight test is, by definition, an exploration into the unknown. It contains risk and uncertainty. The difference is, the NASA engineers at Dryden acknowledge that fact. And they've learned the hard way the value of dissenting voices. So each and every member of an engineering flight test team now has the authority to stop a flight, on their say-so alone, without recriminations. It's one of the reasons NASA has accumulated as good a safety record in aeronautical flight research as it has. But it's also why flight research at NASA takes longer to accomplish than it does at most smaller, entrepreneurial companies. There is a trade-off involved.

Perhaps the airlines and Dryden take a stronger approach to risk management because they have more memories of bad failures. Airplanes have been flying a lot longer than oil companies have been drilling in deep water. And failures not only provide opportunities and incentives to learn and improve procedures and thinking -- they also tend to change people's views about what's likely to happen in the future.

Risk is an elusive, and ultimately unconquerable, opponent. And there's a strong argument to be made, looking at the disaster in the Gulf, that perhaps we shouldn't be opening up the earth in places too remote to close it up again. But in any event, the oil industry would do well to take two lessons from the aeronautics community. First: anything mechanical can and will break -- and that includes back-up systems. Count on it. Second: expect the unexpected. And plan accordingly.   

Lane Wallace is a pilot and adventure writer. Her latest book is Surviving Uncertainty: Taking a Hero's Journey.
