The arrival of the new, computerized economy is regularly heralded -- one might even say hyped -- in the business press. Evidence for "productivity miracles" arising from the computer and from information technology (IT) in general appears to be all around us. Modern steel mills run virtually without labor. The New York Stock Exchange handles electronically a volume of transactions that was inconceivable in the pre-computer age. Businesses nowadays can compute and communicate far faster than they could, say, a decade or two ago. The improvements have indeed been prodigious. For example, if over the past thirty years or so automobile efficiency had increased as dramatically as computer efficiency has in some respects, you would now be able to drive your car coast to coast on about four milliliters of gasoline.
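The scale of that comparison is easy to check with back-of-the-envelope arithmetic. The sketch below is a minimal illustration, not the author's calculation: the trip length (about 3,000 miles) and the baseline fuel economy (25 miles per gallon) are assumed figures of my own, chosen only to show the rough order of magnitude the four-milliliter claim implies.

```python
# Back-of-the-envelope check on the coast-to-coast gasoline comparison.
# Assumed figures (not from the text): ~3,000-mile trip, 25 mpg baseline.
TRIP_MILES = 3_000
MPG = 25
ML_PER_GALLON = 3_785.41  # milliliters in one U.S. gallon

# Fuel a conventional car would need for the trip, in milliliters.
fuel_ml_today = TRIP_MILES / MPG * ML_PER_GALLON

# Efficiency gain implied by making the same trip on about 4 mL.
implied_factor = fuel_ml_today / 4

print(f"Fuel needed at 25 mpg: {fuel_ml_today:,.0f} mL")
print(f"Implied efficiency gain: roughly {implied_factor:,.0f}x")
```

Under these assumptions the trip takes around 450 liters of gasoline, so doing it on four milliliters implies an improvement on the order of a hundred thousandfold -- the sort of multiple that computer price-performance has in fact delivered over a few decades.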
Computerization has also revolutionized (some) factory floors and the inventory-management practices of numerous companies. Some businesses now serve their customers with automated devices rather than human beings -- ATMs, voice mail, and Web sites are common examples. And, of course, many new products and some entirely new industries (for example, Internet access) have been created.
But is it a vastly more productive world -- in the narrow sense of producing more gross domestic product per hour of labor? The official data say no. As the government measures it, productivity growth has not accelerated since the information revolution got going. In fact, except in manufacturing, it has decelerated. And productivity performance has been downright dreadful in some of the areas in which innovations in IT might have been expected to yield the most dramatic dividends -- such as the financial sector. What's going on here? How can we reconcile the dismal productivity numbers with the apparently wondrous developments in IT?