The Computer and the Economy
Will information technology ever produce the productivity gains that were predicted?
The arrival of the new, computerized economy is regularly heralded -- one might even say hyped -- in the business press. Evidence for "productivity miracles" arising from the computer and from information technology (IT) in general appears to be all around us. Modern steel mills run virtually without labor. The New York Stock Exchange handles electronically a volume of transactions that was inconceivable in the pre-computer age. Businesses nowadays can compute and communicate far faster than they could, say, a decade or two ago. The improvements have indeed been prodigious. For example, if over the past thirty years or so automobile efficiency had increased as dramatically as computer efficiency has in some respects, you would now be able to drive your car coast to coast on about four milliliters of gasoline.
Computerization has also revolutionized (some) factory floors and the inventory-management practices of numerous companies. Some businesses now serve their customers with automated devices rather than human beings -- ATMs, voice mail, and Web sites are common examples. And, of course, many new products and some entirely new industries (for example, Internet access) have been created.
But is it a vastly more productive world -- in the narrow sense of producing more gross domestic product per hour of labor? The official data say no. As the government measures it, productivity growth has not accelerated since the information revolution got going. In fact, except in manufacturing, it has decelerated. And productivity performance has been downright dreadful in some of the areas in which innovations in IT might have been expected to yield the most dramatic dividends -- such as the financial sector. What's going on here? How can we reconcile the dismal productivity numbers with the apparently wondrous developments in IT?
Our hypothesis is that both adjectives are wrong: productivity performance is not quite so dismal as the official numbers suggest, and developments in IT are not quite so wondrous.
To be sure, part of the problem is that we are mismeasuring productivity. A corollary of the well-known argument that standard price indexes overstate inflation is that standard quantity measures -- real GDP, for example -- understate production. Since labor input is measured accurately, any underestimate of output translates directly into an underestimate of labor productivity (output per hour). The data insist that the U.S. economy has experienced essentially no increase in total-factor productivity (the ratio of output to all inputs, not just labor) for fifteen to twenty years. That is simply not believable. Furthermore, the fact that productivity performance has been particularly dismal in information-intensive industries where output is hard to measure hints at measurement error.
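The arithmetic link between mismeasured output and mismeasured productivity is mechanical. In the rough notation below (ours, not the statistical agencies'), let $g$ denote an annual growth rate:

\[
\underbrace{g_Y^{\text{measured}} - g_L}_{\text{measured productivity growth}}
\;=\;
\underbrace{g_Y^{\text{true}} - g_L}_{\text{true productivity growth}}
\;-\;
\underbrace{\bigl(g_Y^{\text{true}} - g_Y^{\text{measured}}\bigr)}_{\text{understatement of output growth}}.
\]

If overstated inflation causes measured real output to grow, say, half a percentage point a year more slowly than true output (a figure we use purely for illustration), then measured productivity growth is understated by that same half point, because hours of labor are counted directly.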
Although mismeasurement is surely part of the story, it is not our main concern here. The claim that the IT revolution has boosted productivity enormously is, we believe, based on misunderstanding, hype, and an untested prediction about the future rather than a factual statement about the past. Here are ten reasons for questioning the productivity bounty from IT.
1. It's less than you think. Fascinating as they are, computers and information technology are but a small piece of our vast economy. True, investment in computing and related equipment is the fastest-growing segment of business fixed investment. But last year it still accounted for less than 10 percent of gross investment. This may be the information age, but American industry still needs factories and office buildings, trucks and airplanes, drill presses and stamping machines, and a myriad of other old-fashioned -- and expensive -- investment items. Much of the industrial world is still physical, not virtual.
It is hard to see how a mere 10 percent of investment could revolutionize economy-wide productivity -- although it could well have dramatic effects in some sectors. Indeed, Daniel Sichel, an economist at the Federal Reserve, estimates that investment in computer hardware accounted for only about 0.2 percentage point of the 2.3 percent average annual growth rate of nonfarm business output from 1980 to 1992, and even less in the preceding years.
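In proportional terms (a back-of-the-envelope calculation of our own, using Sichel's figures):

\[
\frac{0.2\ \text{percentage point}}{2.3\ \text{percent per year}} \;\approx\; 0.09,
\]

so on this estimate computer hardware contributed less than a tenth of the measured growth of nonfarm business output over the period.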
Furthermore, as is well known, computer technology grows obsolete with amazing celerity (more on this later), so the share of IT in net investment -- that is, after depreciation -- is even less than its share in gross investment. And, of course, it is net investment that augments the stock of productive capital. In this respect the new world of information technology is a lot like Alice's Wonderland: you have to run pretty fast just to stand still.
Finally, as every boat owner knows, equipping a boat with an engine that is twice as powerful as the original one will not make the boat go twice as fast. In fact, if the engine tries to force the boat to go faster than its "hull speed," the craft may lower its nose and drive itself underwater. Similarly, it is fallacious to think that if the efficiency of computers doubles (or rises a thousandfold), the productivity of the economy as a whole should rise in anything like the same proportion.
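A stylized production function makes the point concrete. Suppose (the functional form and the exponent of 0.03 are our illustrative assumptions, chosen only to reflect computing's small share of total input) that output is

\[
Y \;=\; A\,K_c^{\,0.03}\,X^{\,0.97},
\]

where $K_c$ is computing equipment and $X$ stands for every other input. Doubling the effective power of the computers multiplies output by $2^{0.03} \approx 1.02$, a gain of about 2 percent; even a thousandfold improvement yields only $1000^{0.03} \approx 1.23$, about 23 percent. The engine has become vastly better; the hull has not.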
2. It's not as new as you think. The Internet, satellite communications, and cellular telephones are technological marvels that have not only speeded up communications but also made possible entirely new forms of information transfer. But they hardly constitute the first steps in this direction, nor are they necessarily the biggest. The invention of the telegraph, in the middle of the nineteenth century, allowed messages from New York to Chicago to be delivered more than 3,000 times as fast as before. The laying of the transatlantic cable in 1866 created a like improvement in communication speed between New York and London. And who really thinks that any of the flashy modern innovations in communications approaches the productivity impact of the telephone?
Nor did office and factory automation wait for the computer age. The typewriter had already improved so much by 1900 that typing was three times as fast as handwriting. Adding machines became available in the 1880s, as did Herman Hollerith's original punch-card machine. Henry Ford's innovative assembly lines represented a quantum leap in productivity -- but they were not driven by a computer.
Our point is not to denigrate recent technological achievements. They are stupendous. But as these inventions dazzle us, it is easy to forget that many of the innovations that have contributed the most to industrial productivity came long ago.
Our next four points pertain to the growing pains that are experienced when technology advances faster than our capacity to absorb it -- as information technology surely has.
3. We are always in the learning mode. New products appear constantly. We are often enchanted by the possibilities they offer but unable to exploit them without investing considerable time and effort in learning. When IBM mainframes were the dominant kind of computer, all you had to learn was the operating system's job-control language (JCL) and your favorite applications software, and you were set. Now there are hundreds or even thousands of hardware providers, and probably hundreds of thousands of software providers -- each with different approaches and protocols.
Standardization has gone out the window. Installing a software package on one computer is not necessarily the same as installing it on another. Installing it in the presence of some other kind of software is not the same as installing it without that software. Device-driver conflicts are legion. Sometimes even the vendor's technical-support people have a hard time accomplishing the customer's objectives. All this consumes endless amounts of time and resources, thereby diminishing productivity.
4. Obsolescence occurs quickly. No sooner have you gotten used to WordPerfect 5.0, with a manual of about 500 pages, than WordPerfect 5.1 appears -- with a manual of about 1,000 pages. No sooner are you on speaking terms with that than WordPerfect 6.0 appears, and so on. Unlike much industrial equipment, software is easily rendered obsolete. And since software products are not perishable, the only way a software provider can grow significantly is by regularly inducing customers to buy updates. This turnover adds to the learning problems mentioned above, often without much noticeable improvement in the product -- except, perhaps, for the elimination of bugs in the previous version, which are duly replaced by different bugs in the new version.
5. Fragmentation and lack of quality control. Flowers in the software industry have bloomed prolifically. Once upon a time there were a few large vendors; now thousands and thousands of small vendors are in business. Imagination and initiative have soared in consequence. But quality may have been compromised -- not necessarily by neglect but perhaps because matters have become too complex for anyone to fathom fully. Thus, for example, one high-level computer language works with DOS and Windows but not with Windows 95. A certain well-known Fortran compiler for Windows 95 works flawlessly -- but if you try to execute a program it has compiled, it will tell you that it may not execute correctly in DOS mode and will ask whether you want it to produce a proper DOS version. If you reply no and execute anyway, it works; if you say yes, it bombs. And it would be interesting to know how many person-hours and billions of dollars will have been spent by December 31, 1999, on fixing the "year 2000 problem" that affects a vast number of computer programs.
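The year-2000 problem itself is easy to reproduce in miniature. The sketch below is our own illustration, not any particular company's code: it stores years as two digits, as countless older programs did to save space, and therefore miscomputes any interval that crosses the century boundary.

```python
# Illustrative sketch of the "year 2000 problem": many older programs
# stored only the last two digits of the year.

def years_elapsed(start_yy, end_yy):
    """Elapsed years computed from two-digit year fields (deliberately flawed)."""
    return end_yy - start_yy

# A loan issued in 1995 ("95") and maturing in 2005 ("05"):
print(years_elapsed(95, 5))        # prints -90 instead of the intended 10

# One common repair, "windowing": treat two-digit years below a pivot as
# 20xx and the rest as 19xx.  The pivot of 30 is an arbitrary choice.
def expand(yy, pivot=30):
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand(5) - expand(95))      # prints 10, as intended
```

Trivial as the repair looks here, finding every such field buried in millions of lines of old code is precisely what all those person-hours and dollars are being spent on.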
6. Interdependence can be hazardous. Not long ago most of us worked on freestanding computers; now virtually everything is networked. This development has been mostly to the good. It has paved the way, for example, for enormous advances in information transfer and processing. But being linked to seemingly everyone by far fewer than six degrees of separation has its dark side.
A decade ago a clever hacker prepared a computer "worm" in the form of a Christmas greeting, which he sent by E-mail over IBM's worldwide message network. Recipients, upon opening the mail, would see a cute little Christmas tree displayed on the screen. Unbeknownst to them, however, the E-mail also interrogated their E-mail address books. It then duplicated itself the requisite number of times and sent itself to each address. At these new destinations it repeated the process. Within a few hours the network was so badly overloaded that it was almost brought to a halt.
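The arithmetic of such a chain explains why a few hours were enough to choke the network. The toy simulation below is ours (we do not know the real network's numbers); it simply assumes that every recipient's address book holds ten names and that each "hop" of forwarding takes about an hour.

```python
# Toy model of the Christmas-greeting worm's spread: each copy that is
# opened mails itself to everyone in the recipient's address book.
# The parameters (10 addresses per book, one hop per hour) are
# illustrative assumptions, not measurements.

ADDRESSES_PER_BOOK = 10

copies_this_hop = 1      # the original greeting
total_messages = 1

for hop in range(1, 7):
    copies_this_hop *= ADDRESSES_PER_BOOK
    total_messages += copies_this_hop
    print(f"after hop {hop} (~hour {hop}): {total_messages:,} messages")

# After six hops the network is carrying over a million copies, more than
# enough to overwhelm a corporate mail system, even though each
# individual message is tiny.
```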
Three additional reasons pertain to economists' bread-and-butter concern: the efficient use of resources.
7. Inappropriate pricing. Information is valuable; thus it makes sense for the creators of databases and the providers of access to databases to charge for their use. But the marginal cost of providing access to a database is very close to zero; hence the socially optimal price for such access should also be very close to zero. Yet at such a low price the initial investment cannot be recouped; even operating costs may not be recoverable. Hence database access is typically sold at prices higher than marginal cost, which introduces monopoly elements into the industry and makes the use of information resources less intensive than it would be under competitive pricing.
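The dilemma can be put in one line. Suppose (our notation) that building and maintaining the database costs a fixed amount $F$ and that serving one additional query costs $c$, where $c$ is close to zero:

\[
\text{efficient price: } p = c \approx 0
\quad\Longrightarrow\quad
\text{profit} = (p - c)\,q - F = -F \;<\; 0 .
\]

Any price high enough to recover $F$ must exceed marginal cost, and every potential user who values access at more than $c$ but less than that price is inefficiently shut out.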
In the case of the Internet, even getting an effective pricing system up and running has proved daunting. But here an opposite problem has arisen: the price charged for Internet access is often zero, so the resource is grotesquely overused. The result is that congestion on the information superhighway at rush hours rivals congestion on conventional highways. If things don't improve, Yogi Berra's aphorism may soon apply: The place is so crowded that nobody goes there any more.
8. Games people play. Many of the resources made available by information technology provide amusement but have no visible impact on productivity. In fact, some of the fun may come at the expense of productivity. Aimless surfing of the Internet, social exchange over chat lines, membership in endless listservs, and computer games are all fun, but they probably reduce rather than enhance business productivity. And we have not even mentioned junk E-mail.
9. Information overload. Improvements in information technology are, of course, designed to get more and more information to more and more people more and more rapidly. To a truly remarkable extent the information revolution has achieved this goal -- and progress continues unabated. Now ask yourself a question: How often do you feel that the sheer quantity -- not the quality, mind you, but the quantity -- of information reaching you is insufficient?
Speaking for ourselves, we receive by conventional (paper) means alone far more information than we can possibly process. Often electronic transmissions just add to the glut. Computers may work a million times as fast as they did a (human) generation ago, but the information-processing capabilities of the human brain have undergone no such technological revolution. In truth, most people now spend a significant amount of time just sifting through the information that bombards them each day.
Search engines are an apt modern metaphor for the information-overload problem. How many times have you searched for a combination of words, only to be told that your machine has found 19,468 matches for you to inspect?
Finally, we return to the point made earlier about the likely mismeasurement of productivity. In the case of IT this point needs at least one major qualification.
10. Yes, it's unmeasured, but is it productivity? We grant that the government's data-collection systems are probably too antiquated to capture many of the gains derived from IT. People are better off for being able to bank after hours at an ATM, or to obtain travel information after midnight on the Internet. Stock markets handle with ease far more transactions than they once did. Researchers, lawyers, journalists, and others can now search enormous archives and files of newspapers and magazines quickly and efficiently. The list could go on and on. Such gains are real, though some go unmeasured.
But in other cases "productivity gains" are ephemeral or even chimerical. Our own industry -- college teaching -- offers many such examples. Since students started to submit term papers written with word processors, the appearance of the papers has greatly improved. Lines of text are justified, spell-checkers catch most spelling errors, footnotes fit neatly on the page, and so on. But the thinking has not improved, and the quality of the research has sometimes deteriorated. For example, we know of one college that now requires that term papers contain references to at least some books available in the college library, because students find it so easy to track down facts on the Internet that term papers had come to rely exclusively on Internet references. And casual empiricism suggests that both grammar and spelling in E-mail are atrocious.
Easy access to computers and computer power may have adverse effects even on the work of otherwise careful and thoughtful researchers. Computer programs designed to solve scientific problems normally have to be debugged first -- that is, put through a wringer to discover inadvertent logical errors. In the old days, when researchers used to get only small slices of computer time at sporadic intervals (often late at night), they thought very hard about how to fix their programs -- lest the debugging task take weeks. Nowadays hundreds of passes can be made in a day, and as a result computer users may well substitute computer power for brainpower. It is most unlikely that gains in research productivity -- measured in, say, problems solved per day -- have come even close to those in computing technology.
If it is true, as we believe, that advances in IT have yielded only small gains in productivity up to now, will the future be like the past? Or are we about to enter a new realm of IT-generated productivity?
One school of thought holds that the growing pains we have just discussed will soon give way to enormous gains in productivity as the transition to the information age is completed and new technologies diffuse throughout society. According to this view, we have so far seen only the least-productive tip of the iceberg. The best is yet to come.
Many traditional service jobs will disappear, but new ones will be created. We may have fewer bookkeepers but more data-entry clerks. We will have much less reason to leave our homes for shopping, learning, and discussions with friends and colleagues. There will be profound long-term effects on the publishing industry as we know it. And the quality of education at all levels might be expected to improve. There will be obvious and beneficial impacts on the computing and telecommunications industries.
Many other marvels may be in store for us. For example, the coming decades may see a great deal of consolidation and standardization in IT, which will continue to reduce the costs of using it. A group of professionals, perhaps called "customizing specialists," will probably come into being to solve some of the problems we noted earlier. Furthermore, much of the power of IT may be seen in pure research, whose effects on productivity are extremely long-term.
This optimistic view may well be accurate. But even if it is, our highly productive future may be a long way off. In the meantime, we may be condemned to a lengthy and uncomfortable transition period. Indeed, if technology continues to evolve as quickly in the future as it has in, say, the past three decades (and who is to say it will not?), our human adaptive capabilities may lag further and further behind the new machines.
And, again, there is the nasty problem of information overload. We must entertain the possibility that many people and businesses have already passed the point of positive net returns to information -- net, that is, after processing costs. How many of us really think that our business or professional lives are seriously impeded by a lack of information? More and more information may simply make us less and less able to digest and process the information that is readily at hand.
Ironically, the most profound benefits of information technology may be found not in the economic arena at all but in the political sphere. We have already seen, in the former Soviet Union and its satellites, the salutary effects of a free flow of information on repressive, authoritarian regimes. Such systems can survive only if small groups of people prevent others from understanding and sharing in the decision-making that affects their lives. The channels of control required to maintain an authoritarian system are vertical. Someone at the top decides what should happen, and this information or decision is passed down a chain that eventually reaches the citizens affected. Horizontal cooperation among citizens -- a hallmark of civil society and hence of democracy -- must be discouraged, as it effectively was in the Soviet bloc.
Although information technology can be used for purposes antagonistic to democracy, such as snooping and surveillance, we believe that it is on the whole clearly the enemy of authoritarian systems. The ease with which individuals can browse in publicly accessible information sources, exchange private messages, or log into remote computers makes the flow of information unhindered, free, and vast. When schoolchildren in one country can routinely chat on the Internet with their counterparts in another, when newsletters are posted on listservs, official falsehoods will not long prevail.
This important point has been perceived by the philanthropist George Soros. One objective of his support of worthy causes in Central and Eastern Europe and the former Soviet Union is to spread the use and culture of the Internet, because information technology promotes the growth of open societies. In the end, the primary payoff from advances in information technology may be not in new and better goods and services but in new and better democracies.