The world press is marking the 35th anniversary of the bar code. So many people have grown up with it that it no longer appears to be the sinister innovation it once seemed; as my friend Jackson Lears remarked to the New York Times:
[W]ith the advent of Google Earth and global tracking devices, "it now seems comparatively innocuous."
The bar code "has almost acquired a certain antique appeal as an early expression of the sorting and categorizing impulse in computer-driven marketing and sales," he added. It seems, he said, "in some ways a charmingly archaic icon."
As the Times piece points out, the bar code was supposed to be a transitional technology, to be replaced by directly read numbers. Then it was thought doomed by the development of tiny embedded microchips that would identify products by broadcasting signals -- radio frequency identification (RFID). But for reasons of economics and human factors (imagine having to aim a scanner precisely at a line of numbers), the robust bar code has held its own.
And there's even more to the story. Bar codes have been around even longer than the Times article suggests -- the technology's principles are at least 60 years old. And the mathematical tricks that make it work have been around for at least 40 years; many of the brilliant people behind them are known only to specialists. These tricks are check digits: extra digits, meaningless in themselves, appended to a number so that a machine can double-check the accuracy of the real digits. The book publishing industry was using a scheme developed by the Dutch mathematician Jacobus Verhoeff by the late 1960s, even before it adopted the bar codes that nearly all books now carry. Together, the bar code and the check digit became essential to early electronic commerce. Jeff Bezos, founder of Amazon.com, chose books to start his business partly because one city, Seattle, was rich in both experienced programmers and book distribution warehouses, but also because existing databases of scannable International Standard Book Numbers (ISBNs) were available on CD-ROM and could be uploaded. The Oracle software Bezos used was also not so new; it originated in the 1970s and was based on a theoretical breakthrough published in 1970 by the IBM research mathematician Edgar Codd. Wal-Mart, like Amazon.com, became a dominant retailer through its outstanding exploitation of the ISBN's later and broader counterpart, the Universal Product Code (UPC).
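The check-digit idea is easy to sketch. Verhoeff's own scheme is more elaborate (it also catches transposed digits that simple weighting can miss), but the familiar ISBN-10 and UPC-A formulas illustrate the principle. This is a minimal sketch of those standard public formulas, not the historical program the publishing industry ran:

```python
# Two classic check-digit schemes. A scanner or database recomputes the
# final digit from the others and rejects the number on any mismatch.

def isbn10_check_digit(first_nine: str) -> str:
    """Check digit for a 9-digit ISBN body: weighted sum mod 11."""
    total = sum((10 - i) * int(d) for i, d in enumerate(first_nine))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)  # 10 is written as 'X'

def upc_check_digit(first_eleven: str) -> str:
    """Check digit for an 11-digit UPC-A body: odd positions weighted 3."""
    total = sum((3 if i % 2 == 0 else 1) * int(d)
                for i, d in enumerate(first_eleven))
    return str((10 - total % 10) % 10)

print(isbn10_check_digit("030640615"))   # -> 2 (full ISBN 0-306-40615-2)
print(upc_check_digit("03600029145"))    # -> 2 (full UPC 0-36000-29145-2)
```

Because the weights are chosen so that altering any single digit changes the weighted sum modulo 11 (or 10), every single-digit misread is guaranteed to be caught.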
An unexpected side of this story, then, is how many of information technology's breakthroughs were made by 1975. Of the top ten computer algorithms (problem-solving procedures) of the twentieth century named by editors of the journal Computing in Science and Engineering a few years ago, eight were developed in the two decades between 1947 and 1965, followed by one in 1977 and one a decade later, with none in the 1990s. The PageRank algorithm that helped launch Google is missing, so this isn't an ultimate list. But the bar code story still suggests a slower pace of recent technological change than we often suppose. What has the more recent two-dimensional bar code added for most people but the ability to create custom pet-based postage stamps?
Three sentiments are in order. One is deep respect for the theorists as well as the business people who moved before hardware was quite ready for their ideas. Another is relief that even Wal-Mart can't always impose its will on suppliers and customers; resistance is not necessarily futile. And the third is a question. Are there equally brilliant ideas germinating now, and if so, what has kept them from the marketplace?