Why Workers Are Losing the War Against Machines


In the 21st-century war of man vs. machine in the workplace, what if man isn't supposed to prevail?


At least since the followers of Ned Ludd smashed mechanized looms in 1811, workers have worried about automation destroying jobs. Economists have reassured them that new jobs would be created even as old ones were eliminated. For over 200 years, the economists were right. Despite massive automation of millions of jobs, more Americans had jobs at the end of each decade up through the end of the 20th century. However, this empirical fact conceals a dirty secret. There is no economic law that says that everyone, or even most people, automatically benefit from technological progress.

People with little economics training intuitively grasp this point. They understand that some human workers may lose out in the race against the machine. Ironically, the best-educated economists are often the most resistant to this idea, as the standard models of economic growth implicitly assume that economic growth benefits all residents of a country. However, just as Nobel Prize-winning economist Paul Samuelson showed that outsourcing and offshoring do not necessarily increase the welfare of all workers, it is also true that technological progress is not a rising tide that automatically raises all incomes. Even as overall wealth increases, there can be, and usually will be, winners and losers. And the losers are not necessarily some small segment of the labor force like buggy whip manufacturers. In principle, they can be a majority or even 90% or more of the population.

If wages can freely adjust, then the losers keep their jobs in exchange for accepting ever-lower compensation as technology continues to improve. But there's a limit to this adjustment. Shortly after the Luddites began smashing the machinery that they thought threatened their jobs, the economist David Ricardo, who initially thought that advances in technology would benefit all, developed an abstract model that showed the possibility of technological unemployment. The basic idea was that at some point, the equilibrium wages for workers might fall below the level needed for subsistence. A rational human would see no point in taking a job at a wage that low, so the worker would go unemployed and the work would be done by a machine instead.
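Ricardo's argument can be made concrete with a toy model. This sketch is our illustration, not Ricardo's: the starting numbers and the 20% annual decline in machine costs are arbitrary assumptions chosen only to show the mechanism, in which the wage an employer will pay is capped by the cost of doing the job with a machine.

```python
# Toy Ricardo-style model of technological unemployment (illustrative
# numbers only). An employer will pay a worker at most what the machine
# alternative costs; a worker accepts no wage below subsistence.

SUBSISTENCE = 10.0   # lowest wage a worker will accept (arbitrary units)
machine_cost = 40.0  # current cost of doing the same job by machine

for year in range(10):
    wage_ceiling = machine_cost  # competition from the machine caps the wage
    if wage_ceiling >= SUBSISTENCE:
        status = f"employed at a wage up to {wage_ceiling:.2f}"
    else:
        status = "unemployed: the machine does the work"
    print(f"year {year}: {status}")
    machine_cost *= 0.8  # assume technology cuts machine costs 20% a year
```

Wages can adjust downward for a while, but in year 7 of this run the machine-cost ceiling (about 8.4) drops below subsistence and employment ends: a wage too low to live on, just as Ricardo's model allows.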

Of course, this was only an abstract model. But in his book A Farewell to Alms, economist Gregory Clark gives an eerie real-world example of this phenomenon in action:

There was a type of employee at the beginning of the Industrial Revolution whose job and livelihood largely vanished in the early twentieth century. This was the horse. The population of working horses actually peaked in England long after the Industrial Revolution, in 1901, when 3.25 million were at work. Though they had been replaced by rail for long-distance haulage and by steam engines for driving machinery, they still plowed fields, hauled wagons and carriages short distances, pulled boats on the canals, toiled in the pits, and carried armies into battle. But the arrival of the internal combustion engine in the late nineteenth century rapidly displaced these workers, so that by 1924 there were fewer than two million. There was always a wage at which all these horses could have remained employed. But that wage was so low that it did not pay for their feed.

As technology continues to advance in the second half of the chessboard, taking on jobs and tasks that used to belong only to human workers, one can imagine a time in the future when more and more jobs are more cheaply done by machines than humans. And indeed, the wages of unskilled workers have trended downward for over 30 years, at least in the United States.

We also now understand that technological unemployment can occur even when wages are still well above subsistence if there are downward rigidities that prevent them from falling as quickly as advances in technology reduce the costs of automation. Minimum wage laws, unemployment insurance, health benefits, prevailing wage laws, and long-term contracts--not to mention custom and psychology--make it difficult to rapidly reduce wages. Furthermore, employers will often find wage cuts damaging to morale. As the efficiency wage literature notes, such cuts can be demotivating to employees and cause companies to lose their best people.

But complete wage flexibility would be no panacea, either. Ever-falling wages for significant shares of the workforce are not exactly an appealing solution to the threat of technological unemployment. Aside from the damage this does to the living standards of the affected workers, lower pay only postpones the day of reckoning. Moore's Law is not a one-time blip but an accelerating exponential trend.
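To see why falling pay only postpones the reckoning, consider a back-of-the-envelope sketch. The numbers here are invented for illustration: even if workers accept a 5% pay cut every year, an automation cost that halves every two years, a Moore's-Law-style pace, overtakes them quickly.

```python
# Illustrative only: an exponentially falling automation cost vs. falling
# wages. Assumed numbers: wages start at 20/hour and fall 5% per year;
# the machine alternative starts at 80/hour and halves every two years.

wage = 20.0
machine = 80.0
year = 0
while machine > wage:      # the worker stays cheaper than the machine
    wage *= 0.95           # workers accept a 5% pay cut each year
    machine *= 0.5 ** 0.5  # halving every 2 years = ~29.3% cheaper per year
    year += 1
print(f"the machine undercuts the worker after {year} years")
```

In this run the machine starts four times more expensive than the worker yet undercuts even steadily falling wages after just five years; against an exponential trend, deeper wage cuts buy only a little extra time.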

The threat of technological unemployment is real. To understand this threat, we'll define three overlapping sets of winners and losers that technical change creates: (1) high-skilled vs. low-skilled workers, (2) superstars vs. everyone else, and (3) capital vs. labor. Each set has well-documented facts and compelling links to digital technology. What's more, these sets are not mutually exclusive. In fact, the winners in one set are more likely to be winners in the other two sets as well, which concentrates the consequences.

In each case, economic theory is clear. Even when technological progress increases productivity and overall wealth, it can also affect the division of rewards, potentially making some people worse off than they were before the innovation. In a growing economy, the gains to the winners may be larger than the losses of those who are hurt, but this is a small consolation to those who come out on the short end of the bargain.

Ultimately, the effects of technology are an empirical question--one that is best settled by looking at the data. For all three sets of winners and losers, the news is troubling. Let's look at each in turn.

1. High-Skilled vs. Low-Skilled Workers

    We'll start with skill-biased technical change, which is perhaps the most carefully studied of the three phenomena. This is technical change that increases the relative demand for high-skill labor while reducing or eliminating the demand for low-skill labor. A lot of factory automation falls into this category, as routine drudgery is turned over to machines while more complex programming, management, and marketing decisions remain the purview of humans.

    A recent paper by economists Daron Acemoglu and David Autor highlights the growing divergence in earnings between the most-educated and least-educated workers. Over the past 40 years, weekly wages for those with only a high school degree have fallen, while wages for those with a high school degree and some college have stagnated. College-educated workers, by contrast, have seen significant gains, with the biggest gains going to those who have completed graduate training (Figure 3.5).

    What's more, this increase in the relative price of educated labor--their wages--comes during a period where the supply of educated workers has also increased. The combination of higher pay in the face of growing supply points unmistakably to an increase in the relative demand for skilled labor. Because those with the least education typically already had the lowest wages, this change has increased overall income inequality.

    [Figure 3.5: wages, productivity, and inequality]

    It's clear from the chart in Figure 3.5 that wage divergence accelerated in the digital era. As documented in careful studies by David Autor, Lawrence Katz, and Alan Krueger, as well as Frank Levy and Richard Murnane and many others, the increase in the relative demand for skilled labor is closely correlated with advances in technology, particularly digital technologies. Hence, the moniker "skill-biased technical change," or SBTC. There are two distinct components to recent SBTC. Technologies like robotics, numerically controlled machines, computerized inventory control, and automatic transcription have been substituting for routine tasks, displacing those workers. Meanwhile other technologies like data visualization, analytics, high-speed communications, and rapid prototyping have augmented the contributions of more abstract and data-driven reasoning, increasing the value of those jobs.

    Skill-biased technical change has also been important in the past. For most of the 19th century, about 25% of all agricultural labor threshed grain. That job was automated in the 1860s. The 20th century was marked by an accelerating mechanization not only of agriculture but also of factory work. Echoing Jan Tinbergen, winner of the first Nobel Prize in economics, Harvard economists Claudia Goldin and Larry Katz described the resulting SBTC as a "race between education and technology." Ever-greater investments in education, which dramatically increased the average educational level of the American workforce, helped keep inequality from soaring as technology automated more and more unskilled work. While education is certainly not synonymous with skill, it is one of the most easily measurable correlates of skill, so this pattern suggests that the demand for skills has grown faster than their supply.

    Studies by this book's co-author Erik Brynjolfsson, along with Timothy Bresnahan, Lorin Hitt, and Shinkyu Yang, found that a key aspect of SBTC was not just the skills of those working with computers, but more importantly the broader changes in work organization made possible by information technology. The most productive firms reinvented and reorganized decision rights, incentive systems, information flows, hiring systems, and other aspects of organizational capital to get the most from the technology. This, in turn, required radically different and generally higher skill levels in the workforce. It was not so much that those directly working with computers had to be more skilled, but rather that whole production processes, and even industries, were reengineered to exploit powerful new information technologies. What's more, each dollar of computer hardware was often the catalyst for more than $10 of investment in complementary organizational capital. These intangible organizational assets are typically much harder to change, but they are also much more important to the success of the organization.

    As the 21st century unfolds, automation is affecting broader swaths of work. Even the low wages earned by factory workers in China have not insulated them from being undercut by new machinery and the complementary organizational and institutional changes. For instance, Terry Gou, the founder and chairman of the electronics manufacturer Foxconn, announced a plan this year to purchase 1 million robots over the next three years to replace much of his workforce. The robots will take over routine jobs like spraying paint, welding, and basic assembly. Foxconn currently has 10,000 robots, with 300,000 expected to be in place by next year.

    Erik Brynjolfsson and Andrew McAfee

    Erik Brynjolfsson is the chair of the MIT Sloan Management Review, where Andrew McAfee is a principal research scientist. They are the co-authors of the new book Race Against the Machine.
