The Lessons of ValuJet 592
As a reconstruction of this terrible crash suggests, in complex systems some accidents may be "normal"—and trying to prevent them all could even make operations more dangerous

ON a muggy May afternoon in 1996 an emergency dispatcher in southern Florida got a call from a man on a cellular phone. The caller said, "Yes. I am fishing at Everglades Holiday Park, and a large jet aircraft has just crashed out here. Large. Like airliner-size."
The dispatcher said, "Wait a minute. Everglades Park?"
"Everglades Holiday Park, along canal L-sixty-seven. You need to get your choppers in the air. I'm a pilot. I have a GPS. I'll give you coordinates."
"Okay, sir. What kind of plane did you say? Is it a large plane?"
"A large aircraft similar to a seven-twenty-seven or a umm ... I can't think of it."
This lapse was unimportant. The caller was a born accident observer—a computer engineer and a private pilot with pride in his technical competence and a passion for detail. His name was Walton Little. When he first saw the airplane, it was banked steeply to the right and flying low, just above the swamp. Later he filed an official report, in which he stated,
There was no smoke, no strange engine noise, no debris in the air, no dangling materials or control surfaces, no apparent deformation of the airframe, and no areas that appeared to have missing panels or surfaces.... Sunlight was shining on the aircraft, and some surfaces were more reflective and some less reflective. I saw a difference in reflection of the wing skin in the area where I would expect the ailerons to be, as though they were not neutral. In particular, the lower (outboard) portion of the right wing appeared less reflective as though the aileron was deflected upward.
Nearby fishermen ducked into their boat for cover—but not Walton Little, who stood on his deck, facing "about 115 degrees," and watched the airplane hit the water. The shock wave passed through his body.
I was in disbelief that the crash had occurred. I stood there for just a moment to consider that it really did happen. I was already thinking that I needed to get my cellular phone out of the storage compartment and call 911, but I wanted to assure myself of what I was doing because it is against the law to make false calls to 911.
He called within a minute. After telling the dispatcher about the crash and reading off his latitude and longitude, he said, "I'm in a bass boat on the canal. I thought it was an aircraft from an air show or something, and..."
The dispatcher interrupted. "What did you ... Did you see flames and stuff come up, sir?"
"I heard the impact, and I saw dirt and mud fly in the air. The plane was sideways before it went out of my sight on the horizon about a mile from me."
"Yes, sir. Okay. You said it looked like a seven-twenty-seven that went down?"
"Uh, it's that type aircraft. It has twin engines in the rear. It is larger than an executive jet, like a Learjet."
"Yes, sir."
"It's much bigger than that. I won't tell you it's a seven-twenty-seven, but it's that type aircraft. No engines on the wing, two engines in the rear. I do not see any smoke, but I saw a tremendous cloud of mud and dirt go into the sky when it hit."
"Okay, sir."
"It was white with blue trim."
"White with blue trim, sir?"
"It will not be in one piece."
Walton Little was right. The airplane was a twin-engine DC-9 painted the colors of ValuJet, an aggressive young discount airline based in Atlanta. When it hit the Everglades, it was banked vertically to the right and pointed nearly straight down. The airplane did not sink mysteriously into the swamp, as reports later suggested, but shattered as it hit the surface with the furious force of a fast dive.
By the time Walton Little felt the shock wave, everyone aboard was dead—two pilots, three flight attendants, and 105 passengers. Their remains lay in a shallow, watery crater filled with liquid mud and grass. All that marked the surface was a fractured engine, a few dead fish, some jet fuel, and a scattering of personal papers, clothes, and twisted pieces of aluminum—the stuff of tragedy. During those first few days some officials worried aloud about the accident's effect on nature, but the swamp was not so fragile as that, and quickly resumed its usual life. The families of those who died have proved less resilient. Most will feel the poison forever.

Consider, for simplicity, that there are three kinds of airplane accidents. The most common ones might be called "procedural." They are those old-fashioned accidents that result from single obvious mistakes, that can immediately be understood in simple terms, and that have simple resolutions. To avoid such accidents pilots must not fly into violent thunderstorms, or take off with ice on their wings, or descend prematurely, or let fear or boredom gain the upper hand. Mechanics, ramp agents, and air-traffic controllers must observe equally simple rules. As practitioners, we have together learned many painful lessons.
The second kind of accident could be called "engineered." It consists of those surprising materials failures that should have been predicted by designers or discovered by test pilots but were not. Such failures at first defy understanding, but ultimately they yield to examination and result in tangible solutions. An American Eagle ATR turboprop dives into a frozen field in Roselawn, Indiana, because its de-icing boots did not protect its wings from freezing rain—and as a result new boots are designed, and the entire testing process undergoes review. A USAir Boeing 737 crashes near Pittsburgh because of a rare hard-over rudder movement—and as a result a redesigned rudder-control mechanism will be installed on the whole fleet. A TWA Boeing 747 blows apart off New York because, whatever the source of ignition, its nearly empty center tank contained an explosive mixture of fuel and air—and as a result explosive mixtures may in the future be avoided. Such tragic failures seem all too familiar, but in fact they are rare, and they will grow rarer still as aeronautical engineering improves. One can regret the lives lost and deplore the slowness with which officials respond, but in the long run there is reason to be optimistic. The Wright brothers were products of the Enlightenment. Our science will prevail.
The ValuJet accident is different. I would argue that it represents the third and most elusive kind of disaster, a "system accident," which may lie beyond the reach of conventional solution, and which a small group of thinkers, inspired by the Yale sociologist Charles Perrow, has been exploring elsewhere—for example, in power generation, chemical manufacturing, nuclear-weapons control, and space flight. Perrow has coined the more loaded term "normal accident" for such disasters, because he believes that they are normal for our time. His point is that these accidents are science's illegitimate children, bastards born of the confusion that lies within the complex organizations with which we manage our dangerous technologies. Perrow is not an expert on commercial flying, but his thinking applies to it nonetheless. In this case the organization includes not only ValuJet, the archetype of new-style airlines, but also the contractors that serve it and the government entities that, despite economic deregulation, are expected to oversee it. Taken as a whole, the airline system is complex indeed.

We can find fault among those directly involved—and we probably need to. But if our purpose is to attack the roots of such an accident, we may find them so entwined with the system that they are impossible to extract without toppling the whole structure. In the case of ValuJet the study of system accidents presents us with an uncomfortable possibility: that because we have come to depend on flight, we cannot stop the occasional sacrifice unless we are willing to end our affordable airline system as we know it. Beyond the questions of blame, it requires us to consider that our solutions, by adding to the complexity and obscurity of the airline business, may actually increase the risk of accidents. System-accident thinking does not demand that we accept our fate without a struggle, but it serves as an important caution.
THE distinction among procedural, engineered, and system accidents is of course not absolute. Most accidents are a bit of each. And even in the most extreme cases of system failure the post-crash investigation must work its way forward conventionally, usefully identifying those problems that can be fixed, before the remaining questions begin to force a still-deeper examination. That was certainly the way with ValuJet Flight 592.
It was headed from Miami to Atlanta, flown by Captain Candalyn Kubeck, age thirty-five, and her copilot Richard Hazen, age fifty-two. They represented a new kind of commercial pilot, experienced not only in the cockpit but in the rough-and-tumble of the deregulated airline industry, where both had held a number of low-paid flying jobs before settling on ValuJet. It would have been no shock to them that ValuJet pilots were non-unionized, or that the company required them to pay for their own training. With 9,000 flight hours behind her, more than 2,000 of them in a DC-9, Kubeck earned what the free market said she was worth—about $43,000 a year, plus bonuses. Hazen, formerly in the Air Force and with similar experience, earned a bit more than half as much.
Pilots were not the only low-paid employees at ValuJet—flight attendants, ramp agents, and mechanics made a lot less there than they would have at a more traditional airline. So much work was farmed out to temporary employees and independent contractors that ValuJet was sometimes called a "virtual airline." FAA regulators had begun to worry that the company was moving too fast, and not keeping up with its paperwork, but there was no evidence that the people involved were inadequate. Many of the pilots were refugees from the labor wars at the old Eastern Airlines, and they were generally as competent and experienced as their higher-paid friends at United, American, and Delta. ValuJet was helping the entire industry to understand just how far cost-cutting could be pushed. Its flights were cheap and full, and its stock was strong on Wall Street.
But six minutes out of Miami, while climbing northwest through 11,000 feet, Richard Hazen radioed, "Ah, five-ninety-two needs an immediate return to Miami." In the deliberate calm of pilot talk this was strong language. The time was thirty-one seconds after 2:10 P.M., and the sun was shining. Something had gone wrong with the airplane.
The radar controller at Miami Departure answered immediately. Using ValuJet's radio name "Critter" (for the company's cartoonish logo—a smiling airplane), he gave the flight clearance to turn initially toward the west, away from Miami and conflicting traffic flows, and to begin a descent to the airport. "Critter five-ninety-two, ah roger, turn left heading two-seven-zero, descend and maintain seven thousand."
Hazen said, "Two-seven-zero, seven thousand, five-ninety-two."
The controller was Jesse Fisher, age thirty-six, a seven-year veteran, who had twice handled the successful return of an airliner that had lost cabin pressurization. He had worked the night before, and had gone home, fed his cat, and slept well. He felt alert and rested. He said, "What kind of problem are you having?"
Hazen said, "Ah, smoke in the cockpit. Smoke in the cabin." His tone was urgent.
Fisher kept his own tone flat. He said, "Roger." Over his shoulder he called, "I need a supervisor here!"
The supervisor plugged in beside him. On Fisher's radar screen Flight 592 appeared as a little oval and an associated group of numbers, including a readout of its altitude. Fisher noticed that the airplane had not yet started to turn. He gave the pilots another heading, farther to the left, and cleared them down to 5,000 feet.
Aboard the airplane Hazen acknowledged the new heading but misheard the altitude assignment. It didn't matter. Flight 592 was burning, and the situation in the cockpit was rapidly getting out of hand. One minute into the emergency the pilots were still tracking away from Miami, and had not begun their return. Hazen said, "Critter five-ninety-two, we need the, ah, closest airport available."
The transmission was garbled or blocked, or Fisher was distracted by competing voices within the radar room. For whatever reason, he did not hear Hazen's request. When investigators later asked him if in retrospect he would have done anything differently, he admitted that he kept asking himself the same question. Even without hearing Hazen's request he might have suggested some slightly closer airport. But given that the flight's position was only twenty-five miles to the northwest, Miami still seemed like the best choice, because of the emergency equipment there. In any case "Miami" was the request he had heard, and he intended to deliver it.
To Hazen he said, "Critter five-ninety-two, they're gonna be standing, standing by for you." He meant the crash crews at Miami. "You can plan Runway One-two. When able, direct to Dolphin now."
Hazen said, "... need radar vectors." His transmission was garbled by loud background noises. Fisher thought he sounded "shaky."
Fisher answered, "Critter five-ninety-two, turn left heading one-four-zero."
Hazen said, "One-four-zero." It was his last coherent response.
The flight had only now begun to move through a gradual left turn. Fisher watched the target on his screen as it tracked through the heading changes: the turn tightened and then slowed again. With each sweep of the radar beam the altitude readouts showed a gradual descent—8,800, 8,500, 8,100. Two minutes into the crisis Fisher said, "Critter five-ninety-two, keep the turn around, heading ah one-two-zero."
Flight 592 may have tried to respond—someone keyed a microphone without talking.
Fisher said, "Critter five-ninety-two, contact Miami Approach on—correction, no, you just keep on my frequency."
Two and a half minutes had gone by. It was 2:13 P.M. The airplane was passing through 7,500 feet when suddenly it tightened the left turn and entered a steep dive. Fisher's radar showed the turn and an altitude readout of XXX—code for such a rapid altitude change that the computer cannot keep up. Investigators later calculated that the airplane rolled to a sixty-degree left bank and dove 6,400 feet in thirty-two seconds. During that loss of control Fisher radioed mechanically, "Critter five-ninety-two, you can, ah, turn left, heading one-zero-zero, and join the Runway One-two localizer at Miami." He also radioed, "Critter five-ninety-two, descend and maintain three thousand."
Then the incredible happened. The airplane rolled wings-level again and pulled sharply out of its dive. It is highly unlikely that the airplane would have done this on its own. It is possible that the autopilot kicked in, or that one of the pilots, having been incapacitated by smoke or defeated by melting control cables, somehow momentarily regained control. Fisher watched the radar target straighten toward the southeast, and again read out a nearly level altitude—now, however, merely a thousand feet. The airplane's speed was almost 500 miles an hour.
The frequency crackled with another unintelligible transmission. Shocked into the realization that the airplane would be unable to make Miami, Fisher said, "Critter five-ninety-two, Opa-Locka Airport's about ah twelve o'clock at fifteen miles."
Walton Little, in his bass boat, spotted the airplane then, as it rolled steeply to the right. The radar, too, noticed that last quick turn toward the south, just before the final nose-over. On the next sweep of the radar the flight's data block went into "coast" on Fisher's screen, indicating that contact had been lost. The supervisor marked the spot electronically and launched rescue procedures.
Fisher continued to work the other airplanes in his sector. Five minutes after the impact another low-paid pilot, this one for American Eagle, radioed, "Ah, how did Critter make out?" Fisher didn't answer.

IT was known from the start that fire took the airplane down. The federal investigation began within hours, with the arrival that evening of a National Transportation Safety Board team from Washington. The investigators set up shop in an airport hotel, which they began to refer to as the "command post." The language is important. As we will see, similar forms of linguistic stiffness, specifically engineerspeak, ultimately proved to have been involved in the downing of Flight 592—and this is a factor that the NTSB investigators, because of their own verbal awkwardness, have been unable quite to recognize.
It is not reasonable to blame them for this, though. The NTSB is a technical agency, staffed by technicians, which occupies a central position in the stilted world of aviation. Its job is to examine important accidents and to issue nonbinding safety recommendations—opinions, really—to industry and government. Because the investigators have no regulatory authority and must rely on persuasion to influence events, it may at times be necessary for them to use official-sounding language. Even among its opponents, who often feel that its recommendations are impractical, the NTSB has a reputation for technical competence. The NTSB is a piece of engineering done right. In a world built on compromise, it manages to play the old-fashioned, unambiguous role of the public's defender.
The press plays a more difficult role, though one equally important to the public's safety. It has a classically symbiotic relationship with the NTSB, relying on the investigators for information while providing them with their only effective voice. Nonetheless, in the time of crisis immediately after an accident, a tension exists between the two. Working under pressure to get the story out, reporters resent the caution of the investigators and their reluctance to speculate anonymously. Working under pressure to get the story right, investigators, for their part, resent the reporters' incessant demands during the difficult first days of an accident probe—the recovery of human remains and airplane parts. By the time I got to Miami, nineteen hours after Flight 592 hit the swamp, the two camps had assumed their habitual positions and were passing each other warily in the hotel lobby.
Twenty miles to the northwest, deep in the Everglades, the recovery operation was already under way. The NTSB had set up a staging area—a "forward ops base," one official called it—beside the Tamiami Trail, a two-lane highway that traverses the watery grasslands of southern Florida. Within two days this staging area blossomed into a chaotic encampment of excited officials—local, state, and federal—with their tents and air-conditioned trailers, their helicopters, their cars and flashing lights. I quit counting the agencies. The NTSB had politely excluded most of them from the actual accident site, which lay seven miles north, along a narrow levee road.
The press was excluded even from the staging area, but was provided with two news conferences a day, during which investigators cautiously doled out tidbits of information. One NTSB official said to me, "We've got to feed them or we'll lose control." But the reporters were well behaved, and if anything a bit overcivilized. Near the staging area they settled into their own little town of television trucks, tents, and lawn chairs. The location gave them good Everglades backdrops and shots of alligators swimming by; the viewing public could not have guessed that they stood so far from the action. They acted impatient, but in truth this was not a bad assignment; at its peak their little town boasted pay phones and pizza delivery.
Maybe it was because of my obvious lack of deadline that the investigators made an exception in my case. They slipped me into the front seat of a Florida Game and Fish helicopter whose pilot, in a fraternal gesture, invited me to take the controls for the run out to the crash site. From the staging area we skimmed north across the swamped grasslands, loosely following the levee road, before swinging wide to circle over the impact zone—a new pond defined by a ring of turned mud and surrounded by a larger area of grass and water and accident debris. Searchers in white protective suits waded side by side through the muck, piling pieces of people and airplane into flat-bottomed boats. It was hot and unpleasant work performed in a contained little hell, a place that one investigator later described to me as reeking of fuel, earth, and rotting flesh—the special smell of an airplane accident. We descended onto the levee, about 300 yards away from the crash site, where an American flag and a few tents and trucks constituted the recovery base.

It was, of course, a somber place to be. Human remains lay bagged in a refrigerated truck for later transport to the morgue. A decontamination crew washed down torn and twisted pieces of airplane, none longer than several feet. Investigators tagged the most promising wreckage, to be trucked immediately to a hangar at an outlying Miami airport, where specialists could study it. Farther down the levee I came upon a soiled photograph of a young woman with a small-town face and a head of teased hair. A white-suited crew arrived on an airboat and clambered up the embankment to be washed down. Another crew set off. A boatload of muddy wreckage arrived. The next day the families of the dead came on buses, and laid flowers and cried. Pieces of the airplane kept being hauled up for nearly another month.
Much was made of this recovery, which—prior to the offshore retrieval of TWA's Flight 800—the NTSB called the most challenging in its history. It is true that the swamp made the search slow and difficult, and that the violence of the impact meant that meticulous work was required to reconstruct the critical forward cargo hold. However, it is also true that the physical part of the investigation served to confirm what a look at a shipping ticket had already suggested—that ValuJet Flight 592 burned and crashed not because the airplane failed but, in large part, because the airline did.
To me as a pilot, the most impressive aspect of the investigation was the speed with which it worked through the false pursuit of an electrical fire—an explanation supported by my own experiences in flight, and all the more plausible here because the ValuJet DC-9 was old and had experienced a variety of electrical failures earlier the same day, including a tripped circuit breaker that had resisted the attentions of a mechanic in Atlanta, and then mysteriously had fixed itself. I was impressed also by the instincts of the reporters, who for all their technical ignorance seized on the news that Flight 592 had been loaded with a potentially dangerous cargo of chemical oxygen generators—more than a hundred little firebombs that could have caused this accident, and that indeed did.
Flight 592 crashed on a Saturday afternoon. By Sunday the recovery teams were pulling up scorched and soot-stained pieces. On Monday a searcher happened to step on the flight-data recorder, one of two required black boxes meant to help with accident investigations. The NTSB took the recorder to its Washington laboratory and found that a blip in the flight data six minutes after Flight 592's takeoff seemed to indicate a momentary rise in air pressure. Immediately afterward the recorder began to fail intermittently, apparently because of electrical-power interruptions. On Tuesday night, at a press conference at the hotel, Robert Francis, the vice-chairman of the NTSB and the senior official on the scene, announced in a monotone, "There could have been an explosion." A hazardous-materials team would be joining the investigation. The investigation was focusing on the airplane's forward cargo hold, which was located just below and behind the cockpit, and was unequipped with fire detection and extinguishing systems. Routine paperwork indicated that the Miami ground crew had loaded the hold with homeward-bound ValuJet "company material," a witch's brew of three tires—at least two of them mounted—and five cardboard boxes of old oxygen generators.
OXYGEN generators are safety devices. They are small steel canisters mounted in airplane ceilings and seatbacks and linked to the flimsy oxygen masks that dangle in front of passengers when a cabin loses pressurization. To activate oxygen flow the passenger pulls a lanyard, which slides a retaining pin from a spring-loaded hammer, which falls on a minute explosive charge, which sparks a chemical reaction that liberates the oxygen within the sodium-chlorate core. This reaction produces heat, which may cause the surface temperature of the canister to rise to 500° Fahrenheit if the canister is mounted correctly in a ventilated bracket, and much higher if it is sealed in a box with other canisters, which may themselves be heating up. If there is a good source of fuel nearby, such as tires and cardboard boxes, the presence of pure oxygen will cause the canisters to burn ferociously. Was there an explosion on Flight 592? Perhaps. But in any event the airplane was blowtorched into the ground.
It is ironic that the airplane's own emergency-oxygen system was different—a set of simple oxygen tanks, similar to those used in hospitals, that do not emit heat during use. The oxygen generators in Flight 592's forward cargo hold came from three MD-80s, a more modern kind of twin jet, which ValuJet had recently bought and was having refurbished at a hangar across the airport in Miami. As was its practice for most maintenance, ValuJet had hired an outside company to do the job—in this case a large firm called SabreTech, owned by Sabreliner, of St. Louis, and licensed by the FAA to perform the often critical work. SabreTech, in turn, hired contract mechanics from other companies on an as-needed basis. It later turned out that three fourths of the people on the project were just such temporary outsiders. The vulnerability of American wageworkers could be sensed in their testimony after the accident. They inhabited a world of boss men and sudden firings, with few protections or guarantees for the future. As the ValuJet deadline approached, they worked in shifts, day and night, and sometimes through the weekend as well. It was their contribution to our cheap flying.
We will never know everyone at fault in this story. ValuJet gave the order to replace oxygen generators on the MD-80s, most of which had come to the end of their licensed lifetimes. It provided SabreTech with explicit removal procedures and general warnings about the dangers of fire. Over several weeks SabreTech workers extracted the generators and taped or cut off their lanyards before stacking most of them in five cardboard boxes that happened to be lying around the hangar. Apparently they believed that securing the lanyards would keep the generators from being fired inadvertently. What they did not do was place the required plastic safety caps over the firing pins—a precaution spelled out on the second line of ValuJet's written work order. The problem for SabreTech was that no one had such caps, or cared much about finding them. Ultimately the caps were forgotten or ignored. At the end of the job, in the rush to complete batches of paperwork on all three MD-80s, two mechanics routinely "pencil-whipped" the problem by signing off on the safety-cap line as well as on the others, certifying that the work had been done. SabreTech inspectors and supervisors signed off on the work too, apparently without giving the caps much thought.
The timing is not clear. For weeks the five boxes stood on a parts rack beside the airplanes. Eventually mechanics lugged them over to SabreTech's shipping-and-receiving department, where they sat on the floor in the area designated for ValuJet property. A few days before the accident a SabreTech manager told the shipping clerk to clean up the area and get all the boxes off the floor in preparation for an upcoming inspection by Continental Airlines, a potential customer. The boxes were unmarked, and the manager did not care what was in them.
The shipping clerk then did what shipping clerks do, and prepared to send the oxygen generators home to ValuJet headquarters, in Atlanta. He redistributed them equally among the five boxes, laying the canisters horizontally end to end, and packing bubble wrap on top. After sealing the boxes he applied address labels and ValuJet company-material stickers, and wrote "aircraft parts." As part of the load he included two large main tires and a smaller nose tire—at least two of which were mounted on wheels. The next day he asked a co-worker, the receiving clerk, to make out a shipping ticket, and to write "oxygen canisters—empty" on it. The receiving clerk wrote "Oxy Canisters" and then put "Empty" between quotation marks, as if he did not believe it. He also listed the tires.
The cargo stood for another day or two, until May 11, when the SabreTech driver had time to deliver the boxes across the airport to Flight 592. There the ValuJet ramp agent accepted the material, though federal regulations forbade him to, even if the generators were empty, since canisters that have been discharged contain a toxic residue, and ValuJet was not licensed to carry any such officially designated hazardous materials. He discussed the cargo's weight with the copilot, Richard Hazen, who also should have known better. Together they decided to place the load in the forward hold, where ValuJet workers laid one of the big main tires flat, placed the nose tire at the center of it, and stacked the five boxes on top of it around the outer edge, in a loose ring. They leaned the other main tire against a bulkhead. It was an unstable arrangement. No one knows exactly what happened then, but it seems likely that the first oxygen generator ignited during the loading or during taxiing or on takeoff, as the airplane climbed skyward.
Two weeks later and halfway through the recovery of the scorched and shattered parts a worker finally found the airplane's cockpit voice recorder, the second black box sought by the investigators. It had recorded normal sounds and conversation up to the moment—six minutes after takeoff—when the flight-data recorder indicated a pulse of high pressure. The pulse may have been one of the tires exploding. In the cockpit it sounded like a chirp and a simultaneous beep on the public-address system. The captain, Candalyn Kubeck, asked, "What was that?"
Hazen said, "I don't know."
They scanned the airplane's instruments and found sudden indications of electrical failure. It was not the cause but a symptom of the inferno in the hold—the wires and electrical panels were probably melting and burning along with other, more crucial parts of the airplane—but the pilots' first thought was that the airplane was merely up to its circuit-breaking tricks again. The recording here is garbled. Kubeck seems to have asked, "About to lose a bus?" Then, more clearly, she said, "We've got some electrical problem."
Hazen said, "Yeah. That battery charger's kickin' in. Oooh, we gotta ..."
"We're losing everything," Kubeck said. "We need, we need to go back to Miami."
Twenty seconds had passed since the strange chirp in the cockpit. A total electrical failure, though serious, was not in those sunny conditions a life-threatening emergency. But suddenly there was incoherent shouting from the passenger cabin, and women and men screaming, "Fire!" The shouting continued for thirteen seconds and then subsided.
Kubeck said, "To Miami," and Hazen put in the call to Jesse Fisher, the air-traffic controller. When Fisher asked, "What kind of problem are you having?" Kubeck answered, off-radio, "Fire," and Hazen transmitted his urgent "Smoke in the cockpit. Smoke in the cabin."
Investigators now presume that the smoke was black and thick, and perhaps poisonous. The recorder picked up the sound of the cockpit door opening, and the voice of the chief flight attendant, who said, "Okay, we need oxygen. We can't get oxygen back there." Did she mean that the airplane's cabin masks had not dropped, or that they had dropped but were not working? If the smoke was poisonous, the masks might not have helped much, since by design they mix cabin air into the oxygen flow. The pilots were equipped with better, isolating-type masks and with goggles, but may not have had time to put them on. Only a minute had passed since the first strange chirp. Now the voice recorder captured the sound of renewed shouting from the cabin. In the cockpit the flight attendant said, "Completely on fire."
The recording was of little use to the NTSB's technical investigation, but because it showed that the passengers had died in agony, it added emotional weight to a political reaction that was already spreading beyond the details of the accident and that had begun to call the entire airline industry into question. The public, it seemed, would not be placated this time by standard reassurances and the discovery of a culprit or two. The press and the NTSB had put aside their on-site antagonism and had joined forces in a natural coalition with Congress. The questioning was motivated not by an immediate fear of unsafe skies (despite the warnings of Mary Schiavo, a federal whistle-blower who claimed special insight) but rather by a more nuanced suspicion that competition in the open sky had gone too far, and that the FAA, the agency charged with protecting the flying public, had fallen into the hands of industry insiders.

THE FAA's administrator then was a onetime airline boss named David Hinson—the sort of glib and self-assured executive who does well in closed circles of like-minded men. Now, however, he would have to address a diverse and skeptical audience. The day after the ValuJet accident he had flown to Miami and made the incredible assertion that ValuJet was a safe airline—when for 110 people lying dead in a nearby swamp it very obviously was not. He also said, "I would fly on it," as if he believed that he had to reassure a nation of children. It was an insulting performance, and it was taken as evidence of the FAA's isolation and of its betrayal of the public's trust.
After a good night's sleep Hinson might have tried to repair the damage. Instead he appeared two days later at a Senate hearing in Washington sounding like an unrepentant Prussian: "We have a very professional, highly dedicated, organized, and efficient inspector work force that do their job day in and day out. And when we say an airline is safe to fly, it is safe to fly. There is no gray area."
His colleagues must have winced. Aviation safety is nothing but a gray area, and the regulation of it is an indirect process of negotiation and maneuver. Consider the size of the airline business, the scale of the sky, and the loneliness of an airplane in flight. The FAA can affect safety by establishing standards and enforcing them through inspections and paperwork, but it cannot throw the switches or turn the wrenches, or in this case supervise the disposal of old oxygen generators. Safety is ultimately in the hands of the operators, the mechanics and pilots and their managers, because it involves a blizzard of small judgments. Hinson might have admitted this reality to the American public, which is certainly capable of understanding such subtleties, but instead, inexplicably, he chose to link the FAA's reputation to that of ValuJet. This placed the agency in an impossible position. Whether for incompetence or for cronyism, the FAA would now inevitably be blamed.
Within days it came out that certain inspectors at the FAA had been worried about ValuJet for some time and had described their concerns in their reports. Their consensus was that the airline was expanding too fast (from two to fifty-two airplanes over its two-and-a-half-year life) and that it had neither the procedures nor the people in place to maintain standards of safety. The FAA tried to keep pace, but because of its other commitments—including countering the threat of terrorism—it could assign only three inspectors to the airline. At the time of the accident they had run 1,471 routine checks on the operation and made two additional eleven-day inspections, in 1994 and 1995. This level of scrutiny was about normal. But by early 1996 concern had grown within the FAA about the disproportionate number of infractions committed by ValuJet and the string of small bang-ups it had had. The agency began to move more aggressively. An aircraft-maintenance group found such serious problems in both the FAA's surveillance and the airline's operations that it wrote an internal report recommending that ValuJet be "recertified" immediately—meaning that it be grounded and started all over again. The report was apparently sent to Washington, where for reasons that remain unexplained it lay buried until after the accident. Meanwhile, on February 22, 1996, headquarters launched a 120-day "special emphasis" inspection, a preliminary report on which was issued after the first week. This suggested a wide range of problems. The special-emphasis inspection was ongoing when, on May 11, Flight 592 went down.
As this record of official concern emerged, the question changed from why Hinson had insisted on calling ValuJet "safe" after the accident to why he had not shut down the airline before the accident. Trapped by his own simplistic formulations, he could provide no convincing answer. The press and Congress were sharply critical. The FAA launched an exhaustive thirty-day review of ValuJet, perhaps the most concentrated airline inspection in history, assigning sixty inspectors to perform in one month the equivalent of four years' work. Lewis Jordan, a founder and the president of ValuJet, complained that Hinson was, in effect, conducting a witch hunt that no airline could withstand. Jordan had been trying shamelessly to shift the blame for the deaths onto his own contractor, SabreTech, and he received little sympathy now. No one was surprised when ValuJet was grounded indefinitely five weeks after the accident.
Here now was proof that the FAA had earlier neglected its duties. The agency's chief regulator, Anthony Broderick, was the first to lose his job. Broderick was an expert technocrat, disliked by safety crusaders because of his conservative approach to instituting and applying regulations, and respected by aviation insiders for the same reason. Hinson let him take the fall, knowing that Broderick, a man of integrity, would accept responsibility for the FAA's poor performance. But if Hinson thought that he himself could escape with this sacrifice, he was wrong. Broderick's airline friends now joined the critics in disgust. Hinson announced his upcoming resignation.
In a sense, the system worked. The tragedy did have some positive consequences—primarily because the NTSB did an even better job than usual, not only pinpointing the source and history of the fire but also recognizing some of its larger implications. With a well-timed series of press feedings and public hearings the accident team kept the difficult organizational issues alive and managed to stretch the soul-searching through the end of the year and beyond. By shaking up the FAA, the team reminded the agency of its mandate to oversee the safety of the airlines—perhaps prodding the FAA into a renewed commitment to inspections and a resolution to hold airlines responsible for their actions and for the performance of outside shops.
For the airlines, the investigation served as a necessary reminder of the possible consequences of cost-cutting and complacency. Among airline executives smart enough to notice, it may also have served as a warning about the public's growing distrust of their motives and about widespread anger with the whole industry—anger that may have as much to do with the way passengers are handled as with their fears of dying. However one wants to read it, the ValuJet turmoil marked the limits of the public's tolerance. The airlines were cowed, and they submitted eagerly to the banning of oxygen generators as cargo on passenger flights. They then rushed ahead of the FAA with a $400 million promise (not yet fulfilled) to install fire detectors and extinguishers in all cargo holds. The desire to find hidden hazards runs up against the practical difficulties of inspecting cargo. Nonetheless, ground crews can be counted on for a while to watch what they load into airplanes and what they take out and throw away.
And the guilty companies? They lost money and were sued, of course. After firing the two mechanics who had falsely signed the work orders, SabreTech tried to put its house in order. Nonetheless, its customers fled and did not return. The Miami operation shrank from 650 to 135 employees, and in January of last year was forced to close its doors. Soon afterward, as the result of a two-month FAA investigation, SabreTech's new Orlando facility was forced to close as well. ValuJet survived its grounding, and under intense FAA scrutiny returned to the sky later in 1996, with a reduced and standardized fleet of DC-9s; it ultimately changed its name to AirTran. For a while it was probably the safest airline in the country. What, then, explains the feeling, particular to this case, that so little has in reality been achieved?
PILOTS are safety practitioners, steeped in a can-do attitude toward survival and confident in their own skills. We tend to think that man-made accidents must lie within human control. This idea has been encouraged to some extent by the work of a group of Berkeley professors—notably the political scientist Todd La Porte—who study "high-reliability organizations," meaning those with good track records at handling apparently hazardous technologies: aircraft carriers, air-traffic-control centers, certain power companies. They believe that organizations can learn from past mistakes and can tailor themselves to achieve new objectives, and that if the right, albeit difficult, steps are taken, many accidents can be avoided.
Charles Perrow's thinking is more difficult for pilots like me to accept. Perrow came unintentionally to his theory about normal accidents after studying the failings of large organizations. His point is not that some technologies are riskier than others, which is obvious, but that the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur. Those failures will occasionally combine in unforeseeable ways, and if they induce further failures in an operating environment of tightly interrelated processes, the failures will spin out of control, defeating all interventions. The resulting accidents are inevitable, Perrow asserts, because they emerge from the venture itself. You cannot eliminate one without killing the other.
Perrow's seminal book Normal Accidents: Living With High-Risk Technologies (1984) is an unusual work—a hodgepodge of storytelling and exhortation, out of which this new way of thinking has risen. His central device is an organizational chart on which to plot the likelihood of serious system accidents. He does not append numerical values to the chart but uses a set of general risk indicators. In one quadrant stand the processes—like those of most manufacturing—that are simple, slow, linear, and visible, and in which the operators experience failures as isolated and containable events. In the opposite one stand the opaque and tangled processes characterized by a combination of what Perrow calls "interactive complexity" and "tight coupling." By "interactive complexity" he means not simply that there are many elements involved but that those elements are linked in multiple and often unpredictable ways. The failure of one part—whether material, psychological, or organizational—may coincide with the failure of an entirely different part, and this unforeseeable combination will cause the failure of other parts, and so on. If the system is large, the possible combinations of failures are practically infinite. Such unravelings seem to have an intelligence of their own: they expose hidden connections, neutralize redundancies, bypass "firewalls," and exploit chance circumstances that no engineer could have planned for. When the operating system is inherently quick and inflexible (like a chemical process, an automated response to missile attack, or a jet airliner in flight), the cascading failures can accelerate out of control, confounding the human operators and denying them a chance to jury-rig a recovery. That lack of slack is Perrow's tight coupling. Then the only difference between a harmless accident and a human tragedy may be a question, as in chemical plants, of which way the wind blows.
I ran across this thinking by chance, a year before the ValuJet crash, when I picked up a copy of Scott D. Sagan's book The Limits of Safety: Organizations, Accidents, and Nuclear Weapons (1993). Sagan, a Stanford political scientist who is a generation younger than Perrow, is the most persuasive of Perrow's interpreters, and with The Limits of Safety he has solidified system-accident thinking, focusing it more clearly than Perrow was able to. The Limits of Safety starts by placing high-reliability and normal-accident theories in opposition and then tests them against a laboriously researched and previously secret history of failures within U.S. nuclear-weapons programs. The test is a transparent artifice, but it serves to define the two theories. Sagan's obvious bias does not diminish his work.
Strategic nuclear weapons pose an especially difficult problem for system-accident thinking, for two reasons: first, there has never been an accidental nuclear detonation, let alone an accidental nuclear war; and second, if a real possibility of such an apocalyptic failure exists, it threatens the very logic of nuclear deterrence—the expectation of rational behavior on which we continue to base our arsenals. Once again the pursuit of system accidents leads to uncomfortable ends. Sagan is not a man to advocate disarmament, and he shies away from doing so in his book, observing realistically that nuclear weapons are here to stay. Nonetheless, once he has defined "accidents" as less than nuclear explosions (as false warnings, near launches, and other unanticipated breakdowns in this ultimate "high-reliability" system), Sagan discovers a pattern of accidents, some of which were contained only by chance. The reader is hardly surprised when Sagan concludes that such accidents are inevitable.
The book interested me not because of the accidents themselves but because of their pattern, which seemed strangely familiar. Though the pattern represented possibilities that I as a pilot had categorically rejected, this new perspective required me to face the unpredictable side of my own experience with the sky. I had to admit that some of my friends had died in crazy and unlucky ways, that some flights had gone uncontrollably wrong, and that perhaps not even the pilots were to blame. What is more, I had to admit that no matter how carefully I checked my own airplanes, and how cautiously I flew them, the same could happen to me.
That is where we stand now as a society with ValuJet Flight 592, and it may explain our continuing discomfort with the accident. The ValuJet case represents a nearly perfect system accident. It arose from a process that fits most of Perrow's technical requirements of unpredictability and interactive complexity and some of those of tight coupling. More important, it fits the most basic definitions of an accident caused by the very functioning of the system or industry within which it occurred. Flight 592 burned because of its cargo of oxygen generators, yes, but more fundamentally because of a tangle of confusions that will take some entirely different form next time. It is frustrating to fight such a thing, and wrongdoing is difficult to assign.
TAKE, for example, the case of the two SabreTech mechanics who helped to remove the oxygen canisters from the ValuJet MD-80s, ignored the written work orders to install safety caps, stacked the dangerous canisters improperly in cardboard boxes, and finished by falsely signing off on the job. They will probably suffer for the rest of their lives for their negligence, as perhaps they should. But here is what really happened: Nearly 600 people logged time working on the three ValuJet airplanes in SabreTech's Miami hangar, and of them seventy-two logged 910 hours over several weeks for replacing oxygen generators, in most cases because they had "expired"—reached the end of their approved lives. According to ValuJet work card No. 0069, which was supplied to investigators, the second step of the seven-step removal process read, "If generator has not been expended, install shipping cap on firing pin."
This required a gang of hard-pressed mechanics to draw a verbal distinction between canisters that were "expired," meaning most of the ones they were removing, and canisters that were not "expended," meaning many of the same ones, loaded and ready to fire, on which they were expected to put nonexistent caps. Also involved were canisters that were expired and expended, and others that were not expired but were expended. And then, of course, there was the set of new replacement canisters, which were both unexpended and unexpired. If this seems confusing, do not waste your time trying to figure it out—the SabreTech mechanics did not, nor should they have been expected to. The NTSB suggested that one problem at SabreTech's Miami facility may have been the presence of Spanish-speaking immigrants on the work force, but quite obviously the language problem lay on the other side—with ValuJet and the English-speaking engineers, literalists, who wrote the orders and technical manuals as if they were writing to themselves. The real problem, in other words, was engineerspeak.
Before the accident the worry was not about old parts but about new ones—the safe refurbishing of the MD-80s in time to meet the ValuJet deadline. The mechanics quickly removed the oxygen canisters from their brackets and wired green tags to most of them. The green tags meant "repairable," which these canisters were not. Because the replacement of oxygen generators is a rare operation, it is not clear how many of the seventy-two workers understood that these canisters could not be used again, though most of those questioned after the accident claimed to have known at least why the canisters had to be removed. But here, too, there is evidence of confusion. After the accident two tagged canisters were found still lying in the SabreTech hangar. On one of the tags, under "Reason for Removal," someone had written, "out of date." On the other tag someone had written, "generators have been expired fired."
Yes, a mechanic might have found his way past the ValuJet work card and into the huge MD-80 maintenance manual, to chapter 35-22-01, within which line "h" would have instructed him to "store or dispose of oxygen generator." By diligently pursuing his options, the mechanic could have found his way to a different part of the manual and learned that "all serviceable and unserviceable (unexpended) oxygen generators (canisters) are to be stored in an area that ensures that each unit is not exposed to high temperatures or possible damage." By pondering the implications of the parentheses he might have deduced that the "unexpended" canisters were also "unserviceable" canisters and that because he had no shipping cap, he should perhaps take such canisters to a safe area and "initiate" them, according to the procedures described in section 2.D. To initiate an oxygen generator is of course to fire it off, triggering the chemical reaction that produces oxygen and leaves a mildly toxic residue within the canister, which is then classified as hazardous waste. Section 2.D contains the admonition "An expended oxygen generator (canister) contains both barium oxide and asbestos fibers and must be disposed of in accordance with local regulatory compliances and using authorized procedures." No wonder the mechanics stuck the old generators in boxes.
The supervisors and inspectors failed miserably here, though after the accident they proved clever at ducking responsibility. At the least they should have supplied the required safety caps and verified that those caps were being used. If they had—despite all the other errors that were made—Flight 592 would not have burned. For larger reasons, too, their failure is an essential part of this story. It represents not the avarice of profit takers but rather something more insidious—the sort of collective relaxation of technical standards that the Boston College sociologist Diane Vaughan has called "the normalization of deviance," and that she believes existed at NASA in the years leading up to the 1986 explosion of the space shuttle Challenger. The leaking O-rings that allowed the catastrophic blow-by of hot combustion gases were a well-known design weakness, and had been the subject of worried memos and conferences up to the eve of the launch. Vaughan's book The Challenger Launch Decision (1996) is a 575-page exercise in system-accident thinking. After a long immersion in NASA's technical culture, Vaughan concludes that the O-ring worries were put aside in part because the agency had gotten away with launching the O-rings before. As Perrow has argued, what can go wrong usually goes right—and then people draw the wrong conclusions. In a general way this is what happened at SabreTech. Some mechanics now claim to have expressed their concerns about the safety caps, but if they did, they were not heard. The operation had grown used to taking shortcuts.
But let us be honest—mechanics who are too careful will never get the job done. The airline system as it stands today requires people, in flight or on the ground, to compromise, to make choices, and sometimes even to gamble. The SabreTech crews went astray—but not far astray—by allowing themselves quite naturally not to worry about discarded parts. A fire hazard? Sure. The mechanics taped off the lanyards and may have shoved the canisters a little farther away from the airplanes they were working on. The canisters had no warnings about heat on them and none of the standard hazardous-materials placards. It probably would not have mattered anyway, because the work area was crowded with placards and officially designated hazardous materials, and people had learned not to take them too seriously. Out of curiosity a few of the mechanics fired off some canisters and listened to the oxygen come out—it went pssst. No one seems to have considered the possibility that the canisters might accidentally be shipped. The mechanics did finally carry the five cardboard boxes over to the shipping department, but only because that was where ValuJet property was stored—an arrangement that itself made sense.
When the shipping clerk got to work the next morning, he found the boxes without explanation on the floor of the ValuJet area. The boxes were innocent-looking, and he left them alone until he was told to tidy up. Sending them to Atlanta seemed like the best way to do that. He had shipped off "company material" before without ValuJet's specific approval, and he had heard no complaints. He knew he was dealing with oxygen canisters, but apparently did not understand the difference between oxygen storage tanks and generators designed to fire off. When he prepared the boxes for shipping, he noticed the green "repairable" tags mistakenly placed on the canisters by the mechanics, and misunderstood them to signify "unserviceable" or "out of service," as he variously said after the accident. He also drew the unpredictable conclusion that the canisters were therefore empty. He asked the receiving clerk to fill out a shipping ticket. The receiving clerk did as he was asked, listing the tires and canisters, and put quotation marks around the word "Empty." Later, when asked why, he replied, "No reason. I always put like, when I put my check, I put 'Carlos' in quotations. No reason I put that." The reason was that it was his habit. On the shipping ticket he also put "5 boxes" between quotation marks.
But a day or so later, over by Flight 592, the ValuJet ramp agent who signed for the cargo didn't care about such subtleties. ValuJet was not authorized to carry hazardous cargoes of any sort, and it seems obvious now that a shipping ticket listing tires on wheel assemblies and oxygen canisters (whether or not they were empty) should have aroused the ramp agent's suspicions. No one would have complained had he opened the boxes, or summarily rejected the load. There was no hazardous-materials paperwork associated with it, but he had been formally trained in the recognition of unmarked hazards. His ValuJet station-operations manual specifically warned, "Cargo may be declared under a general description that may have hazards which are not apparent, that the shipper may not be aware of this. You must be conscious of the fact that these items have caused serious incidents, and in fact, endangered the safety of the aircraft and personnel involved." It also said,
Your responsibility in recognizing hazardous materials is dependent on your ability to: 1. Be Alert! 2. Take the time to ask questions! 3. Look for labels! ... Ramp agents should be alert whenever handling luggage or boxes. Any item that might be considered hazardous should be brought to the attention of your supervisor or pilot, and brought to the immediate attention of Flight Control and, if required, the FAA. REMEMBER: SAFETY OF PASSENGERS AND FELLOW EMPLOYEES DEPENDS ON YOU!
It is possible that the ramp agent was lulled by the company-material labels. Would the SabreTech workers ship hazardous cargo without letting him know? His conversation with the copilot, Richard Hazen, about the weight of the load may have lulled him as well. Hazen, too, had been formally trained to spot hazardous materials, and he would have understood better than the ramp agent the dangerous nature of oxygen canisters, but he said nothing. It was a routine moment in a routine day. The morning's pesky electrical problems had perhaps been resolved. The crew was calmly and rationally preparing the airplane for the next flight, a procedure that had always worked for them before. As a result the passengers' last line of defense folded. They were unlucky, and the system killed them.

WHAT are we to make of this tangle of circumstance and error? One suspicion is that its causes may lie in the market forces of a deregulated airline industry, and that in order to keep such catastrophes from happening in the future we might need to consider the possibility of re-regulation—a return to the old system of limited competition, union work forces, higher salaries, and expensive tickets. There are calls now for just that. The improvement in safety would come from slowing things down, and allowing a few anointed airlines the leisure to discover their mistakes and act on them. The effects on society, however, would be costly and anti-egalitarian—a return to a constricted system that many fewer people could afford to use. Moreover, technical trends would argue against it. Despite the obvious chaos of the business and the apparent frequency of airline accidents, air travel has become safer under deregulation. Reductions in "procedural" and "engineered" accidents have more than compensated for any increase in system accidents—which in any case must have occurred in the past as well.
The other way to regulate the airline industry is not economic but operational—detailed governmental oversight of all the technical aspects of flight. This is an approach we have taken since the birth of the airlines, in the 1920s, and it is what we expect of the FAA today. Strictly applied standards are all the more important in a free market, in which unchecked competition would eventually require airlines to cut costs to the point of operating unsafely, until accidents forced them out of business one by one. A company should not overload its airplanes or fly them with worn-out parts, but it also cannot compete effectively against other companies that do. Day to day, airline executives may resent the intrusion of government, but in their more reflective moments they must also realize that they need this regulation in order to survive. The friendship that has grown up between the two sides—between the regulators and the regulated—is an expression of this fact, which no amount of self-reform at the FAA can change. When after the ValuJet crash David Hinson, of the FAA, reacted to accusations of cronyism by going to Congress and humbly requesting that his agency's "dual mandate" be eliminated, so that it would no longer be required by law to promote the airlines, he and Congress (which did as he requested) were engaged in a particularly hollow form of political theater.
The FAA's critics had real points to make. The agency had become too worried about the reactions of its allies in the airline industry, and it needed to try harder to enforce existing regulations. Perhaps it needed even to write some new regulations. Like NASA before the Challenger accident, the FAA needed to listen to the opinions and worries of its own lower-level employees. But there are limits to all this, too. When, at a post-crash press conference in Miami, a reporter asked Robert Francis, of the NTSB, "Shouldn't the government protect us against this kind of thing?" the best answer would have been "It cannot, and never will."
The truth helps, because in our frustration with such system accidents we may be tempted to invent solutions that, by adding to the obscurity and complexity of the system, may aggravate just those characteristics that led to the accidents in the first place. This argument for a theoretical point of diminishing safety is a central part of Perrow's thinking, and it seems to be borne out in practice. In his exploration of the North American early-warning system Sagan found that the failures of safety devices and backup systems gave the most dangerous false indications of missile attack—the kind that could have triggered a response. The radiation accidents at Chernobyl and Three Mile Island were both induced by failures in the safety systems. Remember also that the ValuJet oxygen generators were safety devices, that they were backup systems, and that they were removed from the MD-80s because of regulations limiting their useful lives. This is not an argument against such devices but a reminder that elaboration comes at a price.
Human reactions add to the problem. Administrators can think up impressive chains of command and control, and impose complex double checks and procedures on an operating system, and they can load the structure with redundancies, but on the receiving end there comes a point—in the privacy of a hangar or a cockpit—beyond which people rebel. These rebellions are now common throughout the airline business—and, indeed, throughout society. They result in unpredictable and arbitrary actions, all the more so because in the modern, insecure workplace they remain undeclared. The one thing that always gets done is the required paperwork.
Paperwork is a necessary and inevitable part of the system, but it, too, introduces dangers. The problem is not just the burden that it places on practical operations but also the deception that it breeds. The two unfortunate mechanics who signed off on the nonexistent safety caps just happened to be the slowest to slip away when the supervisors needed signatures. The other mechanics almost certainly would have signed too, as did the inspectors. Their good old-fashioned pencil-whipping is perhaps the most widespread form of Vaughan's "normalization of deviance." The falsification they committed was part of a larger deception—the creation of an entire pretend reality that includes unworkable chains of command, unlearnable training programs, unreadable manuals, and the fiction of regulations, checks, and controls. Such pretend realities extend even into the most self-consciously progressive large organizations, with their attempts to formalize informality, to deregulate the workplace, to share profits and responsibilities, to respect the integrity and initiative of the individual. The systems work in principle, and usually in practice as well, but the two may have little to do with each other. Paperwork floats free of the ground and obscures the murky workplaces where, in the confusion of real life, system accidents are born.
It would be wrong to conclude that we should join the alarmists in their prophecies of doom. Flying will remain safe, and for conventional reasons, including the admirable reaction we have seen to the ValuJet crash. But it should also be clear that there are structural limits to flight safety, and that any dream of a zero-accident future is probably about as realistic as the old ValuJet promise to put safety first. If that is true, we had better get used to it. Conventional accidents—those I call procedural or engineered—will submit to our solutions, but as air travel continues to expand, we can expect capricious system accidents to blossom. Understanding why might keep us from making the system even more complex, and therefore perhaps more dangerous, too.