Will information technology ever produce the productivity gains that were predicted?

THE arrival of the new, computerized economy is regularly heralded -- one might even say hyped -- in the business press. Evidence for "productivity miracles" arising from the computer and from information technology (IT) in general appears to be all around us. Modern steel mills run virtually without labor. The New York Stock Exchange handles electronically a volume of transactions that was inconceivable in the pre-computer age. Businesses nowadays can compute and communicate far faster than they could, say, a decade or two ago. The improvements have indeed been prodigious. For example, if over the past thirty years or so automobile efficiency had increased as dramatically as computer efficiency has in some respects, you would now be able to drive your car coast to coast on about four milliliters of gasoline.

Computerization has also revolutionized (some) factory floors and the inventory-management practices of numerous companies. Some businesses now serve their customers with automated devices rather than human beings -- ATMs, voice mail, and Web sites are common examples. And, of course, many new products and some entirely new industries (for example, Internet access) have been created. 

But is it a vastly more productive world -- in the narrow sense of producing more gross domestic product per hour of labor? The official data say no. As the government measures it, productivity growth has not accelerated since the information revolution got going. In fact, except in manufacturing, it has decelerated. And productivity performance has been downright dreadful in some of the areas in which innovations in IT might have been expected to yield the most dramatic dividends -- such as the financial sector. What's going on here? How can we reconcile the dismal productivity numbers with the apparently wondrous developments in IT?

Our hypothesis is that both adjectives are wrong: productivity performance is not quite so dismal as the official numbers suggest, and developments in IT are not quite so wondrous.

To be sure, part of the problem is that we are mismeasuring productivity. A corollary of the well-known argument that standard price indexes overstate inflation is that standard quantity measures -- real GDP, for example -- understate production. Since labor input is measured accurately, any underestimate of output translates directly into an underestimate of labor productivity (output per hour). The data insist that the U.S. economy has experienced essentially no increase in total-factor productivity (the ratio of output to all inputs, not just labor) for fifteen to twenty years. That is simply not believable. Furthermore, the fact that productivity performance has been particularly dismal in information-intensive industries where output is hard to measure hints at measurement error.

Although mismeasurement is surely part of the story, it is not our main concern here. The claim that the IT revolution has boosted productivity enormously is, we believe, based on misunderstanding, hype, and an untested prediction about the future rather than a factual statement about the past. Here are ten reasons for questioning the productivity bounty from IT.


It's less than you think. Fascinating as they are, computers and information technology are but a small piece of our vast economy. True, investment in computing and related equipment is the fastest-growing segment of business fixed investment. But last year it still accounted for less than 10 percent of gross investment. This may be the information age, but American industry still needs factories and office buildings, trucks and airplanes, drill presses and stamping machines, and a myriad of other old-fashioned -- and expensive -- investment items. Much of the industrial world is still physical, not virtual.

It is hard to see how a mere 10 percent of investment could revolutionize economy-wide productivity -- although it could well have dramatic effects in some sectors. Indeed, Daniel Sichel, an economist at the Federal Reserve, estimates that investment in computer hardware accounted for only 0.2 percentage points of the 2.3 percent average annual growth rate of nonfarm business output from 1980 to 1992, and even less in the preceding years.
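A rough back-of-the-envelope check makes Sichel's point concrete. Using only the two figures cited above (0.2 percentage points of hardware contribution out of 2.3 percent total growth), the implied share of output growth attributable to computer hardware works out to under a tenth:

```python
# Back-of-the-envelope check using the figures cited in the text:
# computer hardware contributed 0.2 percentage points of the 2.3 percent
# average annual growth in nonfarm business output, 1980-1992.
hardware_contribution = 0.2   # percentage points per year
total_growth = 2.3            # percent per year

share = hardware_contribution / total_growth
print(f"Computers' share of output growth: {share:.0%}")  # about 9 percent
```

This is an illustrative calculation, not part of Sichel's own analysis; it simply shows why a small slice of investment cannot, by itself, transform economy-wide productivity.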

Furthermore, as is well known, computer technology grows obsolete with amazing celerity (more on this later), so the share of IT in net investment -- that is, after depreciation -- is even less than its share in gross investment. And, of course, it is net investment that augments the stock of productive capital. In this respect the new world of information technology is a lot like Alice's Wonderland: you have to run pretty fast just to stand still.

Finally, as every boat owner knows, equipping a boat with an engine that is twice as powerful as the original one will not make the boat go twice as fast. In fact, if the engine tries to force the boat to go faster than its "hull speed," the craft may lower its nose and drive itself underwater. Similarly, it is fallacious to think that if the efficiency of computers doubles (or rises a thousandfold), the whole set of industrial inputs should therefore become twice as efficient.


It's not as new as you think. The Internet, satellite communications, and cellular telephones are technological marvels that have not only speeded up communications but also made possible entirely new forms of information transfer. But they hardly constitute the first steps in this direction, nor are they necessarily the biggest. The invention of the telegraph, in the middle of the nineteenth century, allowed messages from New York to Chicago to be delivered more than 3,000 times as fast as before. The laying of the transatlantic cable in 1866 created a like improvement in communication speed between New York and London. And who really thinks that any of the flashy modern innovations in communications approaches the productivity impact of the telephone?


Alan S. Blinder is a professor of economics and public affairs at Princeton and former vice chairman of the Federal Reserve.
