Learning From the Costa Concordia Tragedy: Technology and Overconfidence

Over-reliance on technology combined with personal overconfidence can be a deadly mix

The devastating foundering and capsizing of the Costa Concordia is inexcusable. The captain has been arrested and accused of criminally abandoning the ship, and it's hard to see how he and his attorneys will explain away his exchanges with the Italian Coast Guard.

But there's more to the story. While his deviation from the ship's programmed course was unauthorized, he claims that electronic charts told him the depth was safe. Just another alibi? A concerned prospective cruise passenger investigating his background would have discovered that he had a good diploma, 20 years of accident-free experience, and a favorable reputation among his peers, according to Italian newspaper accounts summarized by CNN. He may thus fit a pattern of professionals who take extra risks because of confidence in their skills. In his book Traffic Safety and the Driver, Dr. Leonard Evans points to studies suggesting that racing drivers have more accidents than typical motorists, possibly because of their greater experience and confidence in their abilities, which protect them until they don't.

The captain has claimed that his instruments misled him into thinking his maneuver, though not cleared by the cruise line, was safe. An online article in New Scientist explains:

Like aviation, seafaring is in the midst of major computerisation, with bridges in modern ships like Costa Concordia becoming "glass cockpits". The transnational maritime trade union Nautilus International says that the technology at the heart of this -- the Electronic Charts Display and Information System (ECDIS), which marries GPS and seabed sonar data in one screen -- can be a problem. First, it says that the data on seabed obstacles can be out of date; second, the system generates too many alarms that can lead mariners to ignore them. "The ECDIS screens are only as good as the data that goes into them," says Nautilus spokesman Andrew Limington. "And there are major problems with their user interfaces and ergonomics."

As the link in the paragraph above suggests, only last October ECDIS was being hailed as a safety system that should be mandated.

But it wouldn't be the first safety fix to backfire. Over fifty years ago, misreading the radar scale aboard the Stockholm helped create its "radar-assisted collision" (as risk analysts call it) with the Andrea Doria. The Stockholm's third officer, according to this PBS site,

thought that he was looking at radar data based on a 15-mile range scale, but it is now widely believed that his radar was mistakenly set for 5 miles. So, at this point in the evening, when he saw the Doria on his radar, he believed that she was 12 miles away when actually she was only four miles away.

In the mid-1980s, software flaws in the Therac-25 radiation therapy machine led to massive overdoses, and at least three deaths, when it delivered its high-power beam while operators believed it was running its low-power program. Mode errors are a major problem in avionics design, too. Even everyday devices like clocks and smartphones make it easy to mistake time of day for alarm time, and even to mix up sound settings, as a hapless concertgoer recently discovered.

Learning from disasters is little consolation to the victims' survivors. Unfortunately, the greatest tragedies often come after a generation of apparently safe technology: no lives had been lost in iceberg-ship collisions before the Titanic, and great size was even considered a safety advantage. I've blogged about the cycle here.

Image: Reuters.