It could be the most significant economic puzzle of our time: As a declining share of economic growth goes to workers, are people becoming less valuable to businesses? And if so, why?
The mysterious and growing divide between the rich and the rest in just about every wealthy country on Earth, including the U.S., is really two mysteries in one. The first is why real wage growth has sped up at the top and slowed down for everybody else. The second, more recent, and more fascinating mystery is why labor's share of the winnings in developed economies has been in decline. It's not just that middle-class wages are falling behind those of the rich. Overall wages are falling behind something else -- capital.
People are becoming less valuable to companies. Why?
Simply put, the world shrank. Two seemingly unrelated inventions -- the microprocessor and the shipping container -- conspired to create a global market for all assets, including people. A century of achievements in computing power and shipping ushered in an era of global trade so expansive that it completely disaggregated the process of doing business (especially in manufacturing), allowing firms to treat finished goods as a bundle of globally sourced components and services.
As more than a billion new workers flooded a global labor market open to multinational companies, capable workers became less scarce, and therefore less valuable. To understand how all of this happened, we have to dig into recent economic history, exploring the rise of digital technology and global trade.
THE UNIVERSAL LANGUAGE
Twentieth-century mathematicians were probably aware of the potential for computers to disrupt the market for human labor. It is extremely unlikely, however, that anyone could have predicted the convoluted manner in which that potential was realized. The obvious candidate would have been artificial intelligence -- if machines could mimic or even exceed human intelligence, scores of people would suddenly find themselves useless. Though the specter of true AI still hangs over the 21st century, the great technological disruption of the 20th century was the advent of digital technology. The power of digital technology to reliably store and reproduce information led to a sea change in the way human beings communicate, making worldwide communication cheap, reliable, and instantaneous.
Although no single invention can take full credit for the current ubiquity of digital devices, the microprocessor is generally credited as the core driver of smaller, more powerful hardware. A microprocessor is a computational engine: it takes information as input, temporarily stores and operates on it -- doing the physical work of computation -- and transmits the altered information as output. The surface of a microprocessor is covered in tiny electrical switches called transistors, the workhorses of the chip. The speed of a microprocessor depends largely on the number and size of the transistors it contains. In the early days of computation, transistors were large and computers were slow.
Transistor technology improved rapidly, yielding microscopic transistors and smaller yet vastly more powerful microprocessors. This greater power expanded the scope of tasks that computers could perform, placing them at the heart of processes previously carried out by human beings -- including, ironically, the design and manufacture of microprocessors themselves. Beyond controlling complex manufacturing processes, modern computers can store, process, and reproduce complex forms of sensory information, such as sound and moving images.
This dramatic reduction in scale, coupled with a new ability to realistically reproduce sensory information, transformed the computer from a tool of industry and academia into a consumer item. So while the notion of a "bit" was the fruit of mathematical inquiry, its ubiquity is perhaps better understood as the product of market forces -- the demand for a common unit of information. That demand, from consumers and businesses alike, for the capacity to store and transmit bits prompted the development of an omnipresent network of payments, phone calls, and electronic messages that fundamentally changed the way human beings conduct business and go about their daily lives. Suddenly, a single global network could transmit payment information, photographs, scientific data, and current events, all in near real time, creating geographically independent access to an ocean of transactions, human knowledge, culture, and experience.
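The point about a common unit of information can be made concrete with a toy sketch (illustrative only; the sample values and encodings below are my own, not from the text): a business message, a payment amount, and an image pixel all reduce to the same underlying bytes and bits.

```python
# Illustrative only: three very different kinds of information,
# all reduced to the same common unit -- bytes (groups of 8 bits).

text = "invoice #42".encode("utf-8")   # a business message as bytes
amount = (1999).to_bytes(2, "big")     # a payment amount (cents) as bytes
pixel = bytes([255, 0, 0])             # one red pixel of an image as bytes

# On the wire, all three are just sequences of bits:
for label, payload in [("text", text), ("amount", amount), ("pixel", pixel)]:
    bits = "".join(f"{b:08b}" for b in payload)
    print(label, "->", bits[:16] + "...")
```

Because every payload is the same kind of thing, one network can carry all of them -- which is exactly the homogenization the paragraph above describes.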
Just as computers led to the proliferation of a common unit of information, world trade has been transformed by a common unit of shipping capacity -- the container.
Prior to containerized shipping, each piece of cargo had to be individually loaded onto and unloaded from a vessel by hand, forcing ships to spend substantial amounts of time idle in port. Every second a ship spends idle in port is a second it is not doing what ships are meant to do: shipping goods and generating revenue for the ship's owner. The key to containerized shipping is that goods are packed once, at the point of production, into a standardized "box" that stays packed until it reaches its final destination. This allows containerized goods to be transferred from ship to rail to road and back with little manual labor. By eliminating the lag caused by manually packing and unpacking cargo, containerization dramatically reduced the time it takes to load and unload ships, raising the share of a ship's life spent at sea from around 50% to around 90%. Containerized volumes have grown rapidly since the 1980s, at about 10% per year -- three times faster than total seaborne trade.
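A back-of-the-envelope sketch makes these figures tangible. The 50% and 90% utilization rates and the 10% annual growth rate come from the text; the 30-year horizon is an assumption chosen purely for illustration.

```python
import math

# Share of a ship's life spent at sea, before and after containerization
# (figures from the text).
at_sea_before = 0.50
at_sea_after = 0.90

# Each ship effectively delivers this many times more sea-days per year:
utilization_gain = at_sea_after / at_sea_before  # 1.8x

# At ~10% annual growth, container volumes double roughly every 7 years.
growth = 0.10
doubling_time = math.log(2) / math.log(1 + growth)  # ~7.3 years

# Compounded over an illustrative 30-year span, volumes multiply ~17x.
multiplier_30y = (1 + growth) ** 30

print(f"utilization gain: {utilization_gain:.1f}x")
print(f"doubling time:    {doubling_time:.1f} years")
print(f"30-year growth:   {multiplier_30y:.1f}x")
```

The compounding is the striking part: a steady 10% a year sounds modest, but it implies container traffic roughly doubling every seven years.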