As the cost of doing business in China rises, U.S. manufacturing could be on the verge of a renaissance. But that won't necessarily mean a flood of blue-collar jobs.
In the past year, the conversation about U.S. manufacturing has undergone a quiet but remarkable change. Gone is much of the doom and gloom about the death of American factories. Instead, many now seem certain that industry is due for a comeback here at home.
The latest murmurs of good news came last week, when Cook Associates released the results of a survey finding that 85% of manufacturing executives expected at least some kinds of factory work to return to the U.S. from overseas. The firm polled roughly 3,000 executives at small and mid-size manufacturers, about two-thirds of whom said their companies were currently manufacturing or outsourcing work abroad.
What could drive the revival? Rising wages in China, to start. Workers there are still cheap -- in the country's southern manufacturing hub, they earn just 75 cents an hour -- but they're not as cheap as they used to be. According to the American Institute for Economic Research, the average hourly wage in China doubled between 2002 and 2008. The country's currency has also risen gradually since 2005, from about 12 cents per yuan up to roughly 15 cents.
Pile on the logistical headaches that come from coordinating operations across the Pacific, as well as high fuel costs that make shipping more expensive, and all this has some business people considering a move back to the States. For some kinds of work, at least. In August, Boston Consulting Group released a report predicting a global realignment in the manufacturing sector. By 2015, the firm believes, many kinds of production will be just as cheap in the U.S. as in China, especially for low-volume, heavy goods where labor makes up only a small part of the cost equation. Those include products like car parts, construction equipment, and appliances. Not everything is moving home. Textile mills in South Carolina? Don't hold your breath.
Factories aren't about to disappear from Shenzhen. They'll still be there, churning out iPods, TVs, and pretty much whatever else you can imagine. But they'll cater more to China's domestic market, which is expected to grow exponentially in the coming years. Meanwhile, factories will move back to the United States to build products for sale in North America.
So we could one day be seeing more made in the USA labels. But how many more American workers will be stamping them on? That's where things become tricky. One of the great misconceptions about America's manufacturing decline is that the country no longer builds things. That's simply not true. As the BCG report notes, the value of U.S. output increased by a third between 1997 and 2008, a period when the economy shed millions of manufacturing jobs. The culprit: productivity.
U.S. factories simply need fewer workers than in the past. We've become exceptionally good at making products using very little labor and lots of machines. Think of that GM Super Bowl ad with the oddly sympathetic robot arm that starts moping after it drops a bolt. That sulking hunk of metal is the real face of most U.S. factories.
Of course, someone has to operate all those robots. The increasing importance of technology on the factory floor has turned manufacturing into a high-skill field, as the president of the Federal Reserve Bank of Cleveland noted in a recent speech. As manufacturers have laid off blue-collar workers, they've been hiring more college grads. That means that even if BCG is right, and a return of U.S. factories creates 2 to 3 million domestic jobs, it won't be a cure-all for the problems that now afflict the labor market. Building things increasingly takes a degree. And the current jobs crisis has, more than anything, been about the plight of the undereducated male -- the kind of worker whom increased productivity made redundant in the first place.
The return of more manufacturing would be a great boon for the U.S. But it doesn't mean yesterday's factory worker will get his job back.
“This western-front business couldn’t be done again.”
On this first day of July, exactly 100 years ago, the peoples of the British Empire suffered the greatest military disaster in their history. A century later, “the Somme” remains the most harrowing place-name in the annals not only of Great Britain, but of the many former dependencies that shed their blood on that scenic river. The single regiment contributed to the First World War by the island of Newfoundland, not yet joined to Canada, suffered nearly 100 percent casualties that day: Of 801 engaged, only 68 came out alive and unwounded. Altogether, the British forces suffered more than 19,000 killed and more than 38,000 wounded: almost as many casualties in one day as Britain suffered in the entire disastrous battle for France in May and June 1940, including prisoners. The French army on the British right flank absorbed some 1,600 more casualties.
Boris Johnson stabbed David Cameron in the back. Michael Gove stabbed David Cameron in the back. Michael Gove stabbed Boris Johnson in the back. It’s very simple.
“We have really everything in common with America nowadays,” Oscar Wilde wrote in The Canterville Ghost, “except, of course, language.” And, apparently, political intrigue.
In the United States, the political class has been stunned by the rise of a candidate who bested more than a dozen better-qualified rivals, partly by means of rhetoric as simplistic as monikers like “Little Marco” and “Lyin’ Ted.” But that’s amateur hour. The political machinations on display across the Atlantic in the wake of Britain’s historic vote to leave the European Union are far more sophisticated, and if politics is a game, America’s would be checkers to the U.K.’s three-dimensional chess.
On Thursday, Boris Johnson, the former London mayor who championed and ultimately won the vote for “Brexit,” stunned the political establishment by saying he wouldn’t seek to replace David Cameron as head of the ruling Conservative Party (and, consequently, take the prime ministership). But that only happened after Michael Gove, Johnson’s friend and ally in the “leave” campaign, put forth his own leadership bid instead. It was, as many on Twitter pointed out, a twist worthy of House of Cards (which, after all, was a British show to begin with). The British media, of course, found a way to class that reference up, with one headline saying Gove had “done a double Brutus.”
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
U.S. Education Secretary John King will argue that interactions with children from different backgrounds prepare students for the workforce.
Perhaps no U.S. education secretary has had more personal experience with the power America’s public-school system has to lift up students who have the odds stacked against them than John King. At least when it works as intended.
A Puerto Rican and African American whose parents had both passed away by the time he was 12, King has repeatedly credited New York public schools for saving his life and shaping its trajectory. King attended P.S. 276 in Canarsie and Mark Twain Junior High School in Coney Island, at the time both diverse schools that exposed him not only to high-quality curriculum, but to students and teachers from backgrounds and cultures wildly different from his own.
“As a kid, it gave me a sense of different cultural experiences that people had and different traditions that people had, and as a parent, that has been an important part of thinking about the schools for my daughters,” King said during an interview at his Washington, D.C., office.
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
University leaders and observers discuss the intersection of student protests, free speech and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
Sharing platforms are meant to scale seamlessly throughout the world, but they’ve faced a different knotty set of rules in nearly every city they’ve colonized.
For years now, Airbnb, the popular home-sharing platform, has featured this line of copy at the end of a company mission statement that mostly pledges to promote a sense of adventure and discovery: “And with world-class customer service and a growing community of users, Airbnb is the easiest way for people to monetize their extra space and showcase it to an audience of millions.”
It’s a business model condensed into a coda, casually set off with an “And.” The subtext is that the revenue-making potential of the platform is an afterthought, which implies that its appeal lies in its ease of use. Sign up and rent out your apartment or guest room. It’s easy.
Easy, that is, unless you live in Chicago, where regulations passed last week will require hosts to register with the city, impose a tax on each transaction to pay for the city’s homeless services, and limit the number of apartments that can be rented out in a particular building, depending on its size. Or in San Francisco, Airbnb’s hometown, where a law that went into effect in 2015 limits the total number of days an apartment can be rented out per year and similarly requires hosts to register with the city. (This week, the company, which coincidentally helped draft the 2014 law, decided to sue the city over it.) Months after San Francisco imposed those limits, Santa Monica passed regulations requiring hosts to get business licenses and restricting them from renting out entire properties.