Computer science lingo, on its way to becoming mainstream, has a way of picking up legendary origin stories.
Consider, for instance, the tale of the software “bug.” The most popular etymological backstory isn’t exactly accurate, but that hasn’t stopped people from retelling it. The term emerged, the story goes, when the pioneering computer scientist Grace Hopper discovered a moth—an actual bug—trapped between two components of the enormous Mark II machine she was working on at Harvard in 1947.
The insect was removed and taped into a log book with a little note: “First actual case of bug being found.” All this really happened—the log book is in a collection at the National Museum of American History!—but it doesn’t represent the first time “bug” was used to mean a glitch or flaw in a program. In fact, as the museum points out on its website, engineers were griping about bugs as early as the 1870s, when Thomas Edison complained of them in his work on electrical circuits.
The ill-fated moth’s story is irresistible, which makes it a good example of why tracing linguistic roots is such tricky business—even, or perhaps especially, for relatively new words.
“The vocabulary of computing can be baffling, and just when you have finally figured out the difference between a mainframe and a mini, they're almost obsolete,” wrote Peter H. Lewis in a column for The New York Times back in 1993. In it, Lewis defined a brief list of terms from corporate computing culture, words and phrases like “server,” “open systems,” and “outsourcing.” He also listed a few words and phrases that had trickled into the mainstream, like “hard-wired,” “beta,” and “bandwidth.” And then, there was “kludge”:
KLUDGE, pronounced klooj, is an inelegant but expedient solution to a problem, or a solution done hastily that will eventually fail. Examples: “We kludged it until we can figure out the right way to do it.”
The pronunciation of “kludge”—it rhymes with subterfuge and ice luge, not nudge and fudge—hints at its alternate spelling, “kluge,” which is still commonly used among programmers. The discrepancy may also offer hints as to the earliest uses of the word, which, like “bug,” has its own thicket of folklore to untangle.
“It’s, um, complicated,” the linguist and lexicographer Ben Zimmer told me in an email. “The short answer is that the word was originally spelled ‘kluge’—derived from the surname Kluge, in turn from German klug, ‘clever.’ But then later it began to be spelled as ‘kludge,’ merging with a U.K. slang term with that spelling (apparently derived from a Scots word for ‘toilet’). So now we often get the ‘kludge’ spelling with the ‘kluge’ pronunciation.”
Several sources trace the word’s origins back to 1940s military usage, where it was apparently used in the Navy to describe electronic equipment that “worked well on shore but consistently failed at sea,” according to the Jargon File, a compendium of hacker slang created by the developer Eric S. Raymond. But there are other hints that “kluge” dates back farther, perhaps as a reference to printing-press equipment manufactured by Brandtjen & Kluge in the 1930s.
Newspaper ads from that decade describe Brandtjen & Kluge systems as modern marvels—automatic and ultra-fast—but they also had a reputation for being, well, pretty klugey, according to Raymond’s site: “temperamental, subject to frequent breakdowns, and devilishly difficult to repair.”
“The result of this history is a tangle,” the Jargon File concludes. “Some observers consider this mess appropriate in view of the word’s meaning.”
But ironically, given its definition, kluge is itself a fantastically nuanced word. Here’s how the File describes the sophisticated shades of its meaning:
Take the distinction between a kluge and an elegant solution, and the differing connotations attached to each. The distinction is not only of engineering significance; it reaches right back into the nature of the generative processes in program design and asserts something important about two different kinds of relationship[s] between the hacker and the hack. Hacker slang is unusually rich in implications of this kind, of overtones and undertones that illuminate the hackish psyche.
The File also points out that kluge exists on a spectrum of related slang that might be used to describe the functionality (and beauty) of code more broadly, from “monstrosity” to “perfection.” (Perfection in computer programming, of course, being a “mythical absolute, approximated but never actually attained.”)
Whatever its origins, kluge—or kludge—is a term that’s useful enough to merit wider usage. There are, after all, kluges everywhere in the technological systems that increasingly undergird (and complicate) modern life. Samuel Arbesman has an entire chapter devoted to the kluge in his new book, Overcomplicated, which examines how technological infrastructure has become convoluted beyond even the possibility of human comprehension. (A prime example of this klugery: computer outages that ground entire airlines.)
But the kluges in our midst go beyond the intricacy of massive patched-together systems that eventually fail for reasons people can’t ever fully parse (or, for that matter, fix). Consider the kluged-up American legal code, which is now “more than 22 million words long, with more than 80,000 connections between one section and another.” Or the fact that tax regulations have become so intricate and confusing that the Supreme Court has ruled that you can’t be convicted for willful failure to file tax returns when you’ve made a good-faith error in your filing, as Arbesman points out in his book.
“Essentially, it is more efficient for the law to make these klugey patches on the overcomplicated tax code than to overhaul it entirely from scratch to make it more user-friendly,” he writes.
Arbesman also quotes Stewart Brand, the creator of the Whole Earth Catalog, who describes the kluginess of 21st-century life thusly: “Typically, outdated legacy systems make themselves so essential over the years that no one can contemplate the prolonged trauma of replacing them, and they cannot be fixed completely because the problems are too complexly embedded and there is no one left who understands the whole system.”