The Lessons of ValuJet 592

As a reconstruction of this terrible crash suggests, in complex systems some accidents may be "normal"—and trying to prevent them all could even make operations more dangerous


The Hunt for Blame

THE FAA's administrator then was a onetime airline boss named David Hinson—the sort of glib and self-assured executive who does well in closed circles of like-minded men. Now, however, he would have to address a diverse and skeptical audience. The day after the ValuJet accident he had flown to Miami and made the incredible assertion that ValuJet was a safe airline—when for 110 people lying dead in a nearby swamp it very obviously was not. He also said, "I would fly on it," as if he believed that he had to reassure a nation of children. It was an insulting performance, and it was taken as evidence of the FAA's isolation and of its betrayal of the public's trust.

After a good night's sleep Hinson might have tried to repair the damage. Instead he appeared two days later at a Senate hearing in Washington sounding like an unrepentant Prussian: "We have a very professional, highly dedicated, organized, and efficient inspector work force that do their job day in and day out. And when we say an airline is safe to fly, it is safe to fly. There is no gray area."

His colleagues must have winced. Aviation safety is nothing but a gray area, and the regulation of it is an indirect process of negotiation and maneuver. Consider the size of the airline business, the scale of the sky, and the loneliness of an airplane in flight. The FAA can affect safety by establishing standards and enforcing them through inspections and paperwork, but it cannot throw the switches or turn the wrenches, or in this case supervise the disposal of old oxygen generators. Safety is ultimately in the hands of the operators, the mechanics and pilots and their managers, because it involves a blizzard of small judgments. Hinson might have admitted this reality to the American public, which is certainly capable of understanding such subtleties, but instead, inexplicably, he chose to link the FAA's reputation to that of ValuJet. This placed the agency in an impossible position. Whether for incompetence or for cronyism, the FAA would now inevitably be blamed.

Within days it came out that certain inspectors at the FAA had been worried about ValuJet for some time and had described their concerns in their reports. Their consensus was that the airline was expanding too fast (from two to fifty-two airplanes over its two-and-a-half-year life) and that it had neither the procedures nor the people in place to maintain standards of safety. The FAA tried to keep pace, but because of its other commitments—including countering the threat of terrorism—it could assign only three inspectors to the airline. At the time of the accident they had run 1,471 routine checks on the operation and made two additional eleven-day inspections, in 1994 and 1995. This level of scrutiny was about normal. But by early 1996 concern had grown within the FAA about the disproportionate number of infractions committed by ValuJet and the string of small bang-ups it had had. The agency began to move more aggressively. An aircraft-maintenance group found such serious problems in both the FAA's surveillance and the airline's operations that it wrote an internal report recommending that ValuJet be "recertified" immediately—meaning that it be grounded and started all over again. The report was apparently sent to Washington, where for reasons that remain unexplained it lay buried until after the accident. Meanwhile, on February 22, 1996, headquarters launched a 120-day "special emphasis" inspection, a preliminary report on which was issued after the first week. This suggested a wide range of problems. The special-emphasis inspection was ongoing when, on May 11, Flight 592 went down.

As this record of official concern emerged, the question changed from why Hinson had insisted on calling ValuJet "safe" after the accident to why he had not shut down the airline before the accident. Trapped by his own simplistic formulations, he could provide no convincing answer. The press and Congress were sharply critical. The FAA launched an exhaustive thirty-day review of ValuJet, perhaps the most concentrated airline inspection in history, assigning sixty inspectors to perform in one month the equivalent of four years' work. Lewis Jordan, a founder and the president of ValuJet, complained that Hinson was, in effect, conducting a witch hunt that no airline could withstand. Jordan had been trying shamelessly to shift the blame for the deaths onto his own contractor, SabreTech, and he received little sympathy now. No one was surprised when ValuJet was grounded indefinitely five weeks after the accident.

Here now was proof that the FAA had earlier neglected its duties. The agency's chief regulator, Anthony Broderick, was the first to lose his job. Broderick was an expert technocrat, disliked by safety crusaders because of his conservative approach to instituting and applying regulations, and respected by aviation insiders for the same reason. Hinson let him take the fall: Broderick was a man of integrity and would accept responsibility for the FAA's poor performance. But if Hinson thought that he himself could escape with this sacrifice, he was wrong. Broderick's airline friends now joined the critics in disgust. Hinson announced his upcoming resignation.

In a sense, the system worked. The tragedy did have some positive consequences—primarily because the NTSB did an even better job than usual, not only pinpointing the source and history of the fire but also recognizing some of its larger implications. With a well-timed series of press feedings and public hearings the accident team kept the difficult organizational issues alive and managed to stretch the soul-searching through the end of the year and beyond. By shaking up the FAA, the team reminded the agency of its mandate to oversee the safety of the airlines—perhaps prodding the FAA into a renewed commitment to inspections and a resolution to hold airlines responsible for their actions and for the performance of outside shops.

For the airlines, the investigation served as a necessary reminder of the possible consequences of cost-cutting and complacency. Among airline executives smart enough to notice, it may also have served as a warning about the public's growing distrust of their motives and about widespread anger with the whole industry—anger that may have as much to do with the way passengers are handled as with their fears of dying. However one wants to read it, the ValuJet turmoil marked the limits of the public's tolerance. The airlines were cowed, and they submitted eagerly to the banning of oxygen generators as cargo on passenger flights. They then rushed ahead of the FAA with a $400 million promise (not yet fulfilled) to install fire detectors and extinguishers in all cargo holds. The desire to find hidden hazards runs up against the practical difficulties of inspecting cargo. Nonetheless, ground crews can be counted on for a while to watch what they load into airplanes and what they take out and throw away.

And the guilty companies? They lost money and were sued, of course. After firing the two mechanics who had falsely signed the work orders, SabreTech tried to put its house in order. Nonetheless, its customers fled and did not return. The Miami operation shrank from 650 to 135 employees, and in January of last year was forced to close its doors. Soon afterward, as the result of a two-month FAA investigation, SabreTech's new Orlando facility was forced to close as well. ValuJet survived its grounding, and under intense FAA scrutiny returned to the sky later in 1996, with a reduced and standardized fleet of DC-9s; it ultimately changed its name to AirTran. For a while it was probably the safest airline in the country. What, then, explains the feeling, particular to this case, that so little has in reality been achieved?

A "Normal Accident"

PILOTS are safety practitioners, steeped in a can-do attitude toward survival and confident in their own skills. We tend to think that man-made accidents must lie within human control. This idea has been encouraged to some extent by the work of a group of Berkeley professors—notably the political scientist Todd La Porte—who study "high-reliability organizations," meaning those with good track records at handling apparently hazardous technologies: aircraft carriers, air-traffic-control centers, certain power companies. They believe that organizations can learn from past mistakes and can tailor themselves to achieve new objectives, and that if the right, albeit difficult, steps are taken, many accidents can be avoided.

Charles Perrow's thinking is more difficult for pilots like me to accept. Perrow came unintentionally to his theory about normal accidents after studying the failings of large organizations. His point is not that some technologies are riskier than others, which is obvious, but that the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur. Those failures will occasionally combine in unforeseeable ways, and if they induce further failures in an operating environment of tightly interrelated processes, the failures will spin out of control, defeating all interventions. The resulting accidents are inevitable, Perrow asserts, because they emerge from the venture itself. You cannot eliminate one without killing the other.

Perrow's seminal book Normal Accidents: Living With High-Risk Technologies (1984) is an unusual work—a hodgepodge of storytelling and exhortation, out of which this new way of thinking has risen. His central device is an organizational chart on which to plot the likelihood of serious system accidents. He does not append numerical values to the chart but uses a set of general risk indicators. In one quadrant stand the processes—like those of most manufacturing—that are simple, slow, linear, and visible, and in which the operators experience failures as isolated and containable events. In the opposite one stand the opaque and tangled processes characterized by a combination of what Perrow calls "interactive complexity" and "tight coupling." By "interactive complexity" he means not simply that there are many elements involved but that those elements are linked in multiple and often unpredictable ways. The failure of one part—whether material, psychological, or organizational—may coincide with the failure of an entirely different part, and this unforeseeable combination will cause the failure of other parts, and so on. If the system is large, the possible combinations of failures are practically infinite. Such unravelings seem to have an intelligence of their own: they expose hidden connections, neutralize redundancies, bypass "firewalls," and exploit chance circumstances that no engineer could have planned for. When the operating system is inherently quick and inflexible (like a chemical process, an automated response to missile attack, or a jet airliner in flight), the cascading failures can accelerate out of control, confounding the human operators and denying them a chance to jury-rig a recovery. That lack of slack is Perrow's tight coupling. Then the only difference between a harmless accident and a human tragedy may be a question, as in chemical plants, of which way the wind blows.

I ran across this thinking by chance, a year before the ValuJet crash, when I picked up a copy of Scott D. Sagan's book The Limits of Safety: Organizations, Accidents, and Nuclear Weapons (1993). Sagan, a Stanford political scientist who is a generation younger than Perrow, is the most persuasive of Perrow's interpreters, and with The Limits of Safety he has solidified system-accident thinking, focusing it more clearly than Perrow was able to. The Limits of Safety starts by placing high-reliability and normal-accident theories in opposition and then tests them against a laboriously researched and previously secret history of failures within U.S. nuclear-weapons programs. The test is a transparent artifice, but it serves to define the two theories. Sagan's obvious bias does not diminish his work.

Strategic nuclear weapons pose an especially difficult problem for system-accident thinking, for two reasons: first, there has never been an accidental nuclear detonation, let alone an accidental nuclear war; and second, if a real possibility of such an apocalyptic failure exists, it threatens the very logic of nuclear deterrence—the expectation of rational behavior on which we continue to base our arsenals. Once again the pursuit of system accidents leads to uncomfortable ends. Sagan is not a man to advocate disarmament, and he shies away from doing so in his book, observing realistically that nuclear weapons are here to stay. Nonetheless, once he has defined "accidents" as less than nuclear explosions (as false warnings, near launches, and other unanticipated breakdowns in this ultimate "high-reliability" system), Sagan discovers a pattern of accidents, some of which were contained only by chance. The reader is hardly surprised when Sagan concludes that such accidents are inevitable.

The book interested me not because of the accidents themselves but because of their pattern, which seemed strangely familiar. Though the pattern represented possibilities that I as a pilot had categorically rejected, this new perspective required me to face the unpredictable side of my own experience with the sky. I had to admit that some of my friends had died in crazy and unlucky ways, that some flights had gone uncontrollably wrong, and that perhaps not even the pilots were to blame. What is more, I had to admit that no matter how carefully I checked my own airplanes, and how cautiously I flew them, the same could happen to me.

That is where we stand now as a society with ValuJet Flight 592, and it may explain our continuing discomfort with the accident. The ValuJet case represents a nearly perfect system accident. It arose from a process that fits most of Perrow's technical requirements of unpredictability and interactive complexity and some of those of tight coupling. More important, it fits the most basic definitions of an accident caused by the very functioning of the system or industry within which it occurred. Flight 592 burned because of its cargo of oxygen generators, yes, but more fundamentally because of a tangle of confusions that will take some entirely different form next time. It is frustrating to fight such a thing, and wrongdoing is difficult to assign.


William Langewiesche is a contributing editor of The Atlantic and the author of Sahara Unveiled (1996). His article in this issue will appear in his book Inside the Sky: A Meditation on Flight, to be published this spring by Pantheon Books.


