How should we tell the story of the digital century, now two decades old? We could focus, as journalists tend to do, on the depredations of the connected life. As Facebook, Twitter, and YouTube have devoured the online world, they have undermined traditional media, empowered propagandists, and widened America’s political divides. The smartphone, for all its wonder and utility, has also proved to be a narcotizing agent.
But what if, instead of focusing on Big Tech’s sins of commission, we paid equal attention to its sins of omission—the failures, the busts, the promises unfulfilled? The past year has offered several lurid examples. WeWork, the office-sharing company that claimed it would reinvent the workplace, imploded on the brink of a public offering. Uber, once seen as an unstoppable force that would transform urban transit as radically as the subway had, has likewise seen its public valuation plummet. From January to October, the two firms together lost $10 billion.
While these companies might seem like outliers, their struggles hold a message, not just for investors but for all of us. Big Tech continues to find new and profitable ways to sell ads and cloud space, but it has failed, often spectacularly, to remake the world of flesh and steel.
For decades, we’ve turned to Silicon Valley to show us the future of American endeavor. Optimism flowed from the Bay Area’s evangelists but also from Washington. “In the new economy, human invention increasingly makes physical resources obsolete,” President Ronald Reagan said in a 1988 speech that heralded the promise of the computer chip. In the ’80s and ’90s, Democrats such as Al Gore made up a new generation of liberals—named “Atari Democrats,” after the early video-game company—who believed computer technology would provide opportunity on the scale of the New Deal. The internet age was hailed as a third industrial revolution—a spur for individual ingenuity and an engine of employment.
On these counts, it has not delivered. To the contrary, the digital age has coincided with a slump in America’s economic dynamism. The tech sector’s innovations have made a handful of people quite rich, but the sector has failed to create enough middle-class jobs to offset the decline of the country’s manufacturing base, or to help solve the country’s most pressing problems: deteriorating infrastructure, climate change, low growth, rising economic inequality. Tech companies that operate in the physical world, such as Lyft and DoorDash, offer greater convenience, but they hardly represent the kind of transformation that Reagan and Gore had in mind. These failures—perhaps more than the toxicity of the web—underlie the meanness and radicalism of our era.
Decades from now, historians will likely look back on the beginning of the 21st century as a period when the smartest minds in the world’s richest country sank their talent, time, and capital into a narrow band of human endeavor—digital technology. Their efforts have given us frictionless access to media, information, consumer goods, and chauffeurs. But software has hardly remade the physical world. We were promised an industrial revolution. What we got was a revolution in consumer convenience.
The original Industrial Revolution freed humanity from the centuries-long prison of slow economic growth. In the early 19th century, productivity and income were skyrocketing, first in England and soon throughout Europe. While the transition was brutal for many, the gains were broadly shared: Real wages for the working class doubled in the first half of the century, and life expectancy at birth rose dramatically in the second half.
In the computer age, the economy has trended in the opposite direction. If American productivity had continued to grow as it did from Harry Truman’s election to Richard Nixon’s resignation, the 2013 economy would have been about 60 percent larger. (Dividing those gains equally would have given the typical middle-class household a bonus of roughly $30,000 a year.) Instead, income growth from 1973 to 2013 was 80 percent slower than it had been in those postwar decades.
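The counterfactual above is simple compound-growth arithmetic. A rough sketch shows how a modest gap in annual growth rates compounds into a 60 percent gap over four decades; the rates below are illustrative assumptions chosen to approximate the postwar and post-1973 records, not figures taken from this article:

```python
# Back-of-the-envelope check of the productivity counterfactual.
# The annual rates are assumptions for illustration, not sourced data.

def compound(rate: float, years: int) -> float:
    """Cumulative growth factor from a constant annual rate."""
    return (1 + rate) ** years

postwar_rate = 0.028  # assumed avg. annual productivity growth, ~1948-1973
recent_rate = 0.016   # assumed avg. annual productivity growth, ~1973-2013
years = 2013 - 1973   # the 40-year post-1973 stretch

# How much larger the 2013 economy would be had the postwar pace continued:
gap = compound(postwar_rate, years) / compound(recent_rate, years) - 1
print(f"Counterfactual 2013 economy larger by ~{gap:.0%}")  # ~60%
```

The point of the exercise is that a difference of barely one percentage point per year, left to compound for forty years, is what separates the economy we have from the one the article describes.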
Technology’s defenders claim that the traditional tools of macroeconomics can’t possibly capture the magic of a smartphone—a single device that can function as a camera, a gaming console, a portal to the web, and, yes, a telephone. Look up from your textbooks, they tell economists: Everything is getting better except our ability to measure how much better everything is getting.
But no matter how aggressively you torture the numbers, the computer age has coincided with a decline in the rate of economic growth. When Chad Syverson, an economist at the University of Chicago’s business school, looked at the question of “missing” growth, he found that the productivity slowdown has reduced GDP by $2.7 trillion since 2004. Americans may love their smartphones, but all those free apps aren’t worth trillions of dollars.
And if you look up from your smartphone, progress becomes harder to see. The physical world of the city—the glow of electric-powered lights, the rumble of automobiles, the roar of airplanes overhead and subways below—is a product of late-19th-century and early-20th-century invention. The physical environment feels depressingly finished. The bulk of innovation has been shunted into the invisible realm of bytes and code.
All of that code, technology advocates argue, has increased human ingenuity by allowing individuals to tinker, talk, and trade with unprecedented ease. This certainly feels true. Who could dispute the fact that it’s easier than ever to record music, market a video game, or publish an essay? But by most measures, individual innovation is in decline. In 2015, Americans were far less likely to start a company than they were in the 1980s. According to the economist Tyler Cowen, the spread of broadband technology has corresponded with a drop-off in entrepreneurial activity in almost every city and in almost every industry.
One explanation for the decrease in innovation leads right back to Silicon Valley. Tech’s biggest winners have effectively built monopolies, whether in office software (Microsoft), social media (Facebook), or search advertising (Google). Rather than foster innovation, the tech giants have grown so large that they scare off entrepreneurs in their path. Venture capitalists have a term for the menacing shadow cast by tech Goliaths: the kill zone. The outsize power of tech’s biggest companies has worsened regional inequality, concentrating wealth in the handful of metro areas where they’ve set up shop. Eighty percent of venture-capital investment goes to just three states—California, New York, and Massachusetts. The internet’s tools were supposed to shatter legacy empires, free untapped creativity, and spread wealth. Instead, the tech powers have become as cutthroat and anticompetitive as the companies they once aimed to replace.
For decades, the tech community has waved away its humdrum macroeconomic impact by touting some imminent leap forward. Take, for example, self-driving cars, which would replace flawed human drivers with huge fleets of vehicles steered by cameras and computers, saving lives and creating a new manufacturing industry. As recently as April, Elon Musk predicted that 1 million “robotaxis” would be on the road by 2020, and his optimism has been shared by automakers and technology companies alike. Yet progress on self-driving cars has been stubbornly slow. Encoding in computers visual and manual skills sharpened by millennia of human evolution is no easy task. But this is precisely the kind of near-miraculous accomplishment that Silicon Valley has long promised.
Ironically, the most visible consumer-tech innovation of the past decade hasn’t been computers driving cars but rather contractors driving cars. We’ve seen an explosion of companies that allow consumers to summon products and services to their door, whether it’s food (DoorDash), a handyman (TaskRabbit), or a ride (Uber and Lyft). Goods in this so-called platform economy tend to be shepherded around by workers whose part-time status allows the platforms to avoid providing full benefits, including health insurance. These bargain-rate services make yuppie life convenient. But far from improving transit or enriching workers, these companies exacerbate congestion, deplete resources for public transportation, and entrench urban inequality. Is this what passes for real-world progress in the digital age?
Here’s a fair objection: What if everything I’ve told you about the slowdown in progress and human ingenuity is true—but it’s not Silicon Valley’s fault?
“I think we should be much more disappointed about the slowdown in progress than most people tend to be,” says Patrick Collison, a co-founder and the CEO of the financial-technology company Stripe. “But the slowdown predates the internet, and I still consider the digital revolution, viewed in its totality, a very bright spot in the broader picture of the last 50 years. The status quo, and our broader societal ability to generate innovation, is almost certainly in need of significant change. But if we’re not producing enough gold, it’s important that we blame the geese, not the eggs.”
Collison is right that Big Tech shouldn’t be blamed for all negative economic indicators, many of which are the fault of bad governance, the difficulties of improving productivity in industries such as energy and home construction, and other factors. The digital revolution has in many ways ameliorated the broader slowdown. Yes, geographic mobility is in decline, but the internet has made remote work more feasible. Yes, air travel is no faster than it was 30 years ago (and, in some cases, even slower), but travel-comparison sites have made fares cheaper and in-flight Wi-Fi has made flights more productive.
But letting Silicon Valley off the hook would also be a mistake. The tech sector today bestrides the U.S. economy like a colossus. Data from the Computing Research Association show that from 2013 to 2017, the number of people majoring in computer science more than doubled. According to data from PitchBook, software has a powerful hold over U.S. venture capital, with more than 3,700 deals in 2018; pharmaceuticals and biotech came in a distant second, with just 720 deals. From an R&D standpoint, tech’s supremacy is without precedent. In a paper reviewing the history of U.S. innovation through patent filings, the economists Mikko Packalen and Jay Bhattacharya found that previous inventive sprints generated an explosion in patent filings across several categories—chemistry, electronics, medicine, and mechanical engineering. By contrast, U.S. patents since 2000 have been dominated by computers and communication tech. America’s innovative talents have devolved from versatility to specialization. If we’re going to concentrate so many resources in one sector, that sector had better produce.
Perhaps it’s time to reconsider the wisdom of placing such a big bet on Silicon Valley delivering the United States from its rusting present to a glimmering future. Too much American ingenuity is chasing problems that simply don’t matter. The web was once celebrated as a democratizing force and a means of escaping institutional control. But Silicon Valley’s most profitable business model has been to construct expansive systems for tracking and manipulating human behavior: Together, Facebook and Google make almost 90 percent of their revenue by selling ads. The big problems go unsolved, while tech’s advertising duopoly has swelled to roughly $1.5 trillion in market capitalization.
“The internet age has been very underwhelming compared to what the expectations were,” the economist and aerospace entrepreneur Eli Dourado told me. “I’m also worried that it’s sapping talent from other industries that might benefit from more innovation. All these people building apps and software-as-a-service companies, if they applied themselves to challenges in the physical world—especially on energy, housing, health, and transportation—they could make a real difference.”
Dourado doesn’t think we’ve run out of ideas, but rather that our once-grand ambitions have narrowed to focus on a handful of reliably profitable endeavors, such as ad tech and cloud services—the low-hanging fruit. He advocates for a national project to reach for the higher-up fruit. Silicon Valley could deepen its investments in biotech (which could transform preventive care and disease detection) and construction automation (which could bring down the price of new housing and transit). With help from the federal government, tech could also play a larger role in solving the greatest challenge in human history: climate change. Carbon-capture systems, which remove carbon dioxide from the atmosphere, could slow the rate of global warming while adding hundreds of thousands of jobs. In 2019, the Department of Energy announced more than $150 million in federal funding for carbon-capture R&D. That’s not nothing, but consider that, at its peak, the Apollo moon program siphoned off more than 2 percent of federal spending—the equivalent of nearly $100 billion today.
The idea that Silicon Valley could swoop in and solve all of America’s problems was always an illusion, conjured by technologists seeking to lure capital to California, and by politicians looking to shift responsibility away from Washington. Silicon Valley has a crucial role to play in meeting the challenges of the new century, but it can’t act alone. Transformative advances will require participation from local, state, and federal government, and from the American people, who for too long have bought into the idea that prosperity can be delivered in lines of code.
For the past two decades, we’ve funneled treasure and talent into the ethereal world of software and digital optimization. Imagine what could be accomplished if American ingenuity came back down to Earth.
This article appears in the January/February 2020 print edition with the headline “Where’s My Flying Car?”