Self-fulfilling rumors of ethnic violence spread like a virus across a newly wired India, sending 300,000 citizens fleeing and leading the government to extreme measures.
Smoke hangs over Mumbai at the scene of a violent protest by Muslims in response to unfounded rumors of anti-Muslim violence in a distant region. (AP)
Technology can be a great liberator, but can it sometimes be a public menace? The Indian government seems to think so: it has blocked around 250 websites, ordered Google and Facebook to pull content, threatened legal action against Twitter if it doesn't delete certain accounts, and arrested several people for sending inflammatory text messages, all in the name of public safety. If you're appalled, you're not alone: the U.S. State Department responded by calling on India to respect "full freedom of the internet," highlighting the growing divide between the two governments on web freedom.
But the Indian censorship -- and it is censorship, despite the government's insistence otherwise -- may not be as clear-cut a case of state oppression and overreach as it first appears. It turns out that the Indian government might be right to fear that technology, for all the very real benefits it's brought India, could also be helping to magnify ancient communal tensions in ways that cost lives and, perhaps even worse, might destabilize the delicate social balance within the world's second-most-populous country.
The story begins, depending on how you look at it, 20 years, one month, or one week ago. In 1993, two ethnic groups in the far-northeastern Indian state of Assam clashed over who had more of a right to the land: members of the local Bodo tribe won, and the Muslim Indians lost, fleeing into refugee camps. Last month, that conflict resurfaced, as it periodically does, when a few migrants from Assam got beaten up near the faraway city of Mumbai. No one really knows what happened, but the public perception seems to be that some of Mumbai's Muslims had attacked the Bodo migrants as revenge for the 1993 crisis. Then, last week, two equally dangerous rumors spread across India: that Muslims throughout the country were about to attack northeastern migrants, and, in apparent response, that Bodos in their home state of Assam were planning a pre-emptive strike on the area's Muslims.
That the two rumors were almost certainly unfounded is beside the point: they were mutually reinforcing. The more that people heard about them, the truer they became. Muslims, fearing their fellow believers in Assam were in mortal peril, staged a large protest in Mumbai. Northeastern migrants in the area, afraid the reopened communal tensions could put them at risk, fled. Back in Assam, some northeasterners took news of the exodus as proof of coming Muslim violence and, apparently enraged, attacked the region's Muslims. It's not hard to see how things spiraled out of control from there. By the end of the weekend, northeastern migrants were streaming onto trains to head home to Assam, and Muslims in Assam were fleeing en masse to refugee camps.
Technology didn't cause any of this, of course. But social media and text messaging, both of which are becoming increasingly common across India's enormous lower and middle classes, accelerated the flow of rumors and of inflammatory images. Some of the material turns out to have been fake: doctored images and videos showed anti-Muslim attacks that never happened. Because the rumors can be self-fulfilling, their lightning-fast spread across India's vast population, much of which is very newly connected to the web, can be costly. The original 1993 crisis displaced an estimated 20,000 people, but this most recent manifestation has already displaced 300,000 and killed 80. No doubt there are many factors that might explain the new severity of this old crisis, but with the spread of rumors apparently playing a significant role, the recent explosion in Indian Internet access (the 100 millionth Indian web user logged on in December) could be relevant. The government, unable to counter the destabilizing rumors, shut down some of the means of their dispersal.
Whether or not the Indian government's censorship does anything to calm this crisis, its apparent desperation is understandable. Still, India's readiness to censor the web is part of the government's longer-running effort to regulate the Internet, to which Western governments and web freedom advocates have strenuously objected. Some of India's sweeping restrictions compel web companies like Google and Facebook to self-police, and then self-censor, any content that could be perceived as blasphemous or offensive to ethnic groups. Protesters in India decry the restrictions as extreme, and they're not wrong.
When governments in places like Ethiopia or China censor the internet, they tend to cite some version of the same basic idea: free discussion is a threat to "national stability." Typically, web freedom activists perceive this as little more than an excuse for online authoritarianism, and they're probably often correct. But what if, in India's case, the government could actually be right? Can Photoshopping up some "evidence" of ethnic attacks be akin to inciting violence? What about sending a text message falsely claiming such attacks, for which a Bangalore man was arrested? At what point does a Facebook rumor become a cry of "fire" in the crowded theater of Indian ethnic anxieties?
Walter Russell Mead, writing on the ongoing crisis, called India's long-running communal tensions "the powder keg in the basement." With the already-dangerous risk of ethnic combustion heightened by a population with easy access to rumors and an apparent predisposition to believe them, maybe that powder keg justifies Indian censorship. Or maybe it doesn't; free speech is its own public good and public right, and, in any case, censoring discussion of such sensitive national issues could make it more difficult for India to actually confront them. This is just one of the many difficult questions that Indian leaders will grapple with as hundreds of thousands of their citizens flee their homes, chased out by "a swirl of unfounded rumors." I don't envy them.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
University leaders and observers discuss the intersection of student protests, free speech, and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
People in Great Britain felt their leaders weren’t treating them fairly. Politicians in the U.S. should take note.
Britain’s Brexit vote has shocked the political elites of both the U.S. and Europe. The vote wasn’t just about the EU; in fact, polls before the referendum consistently showed that Europe wasn’t at the top of voters’ lists of concerns. But on both sides of the Atlantic Ocean, large numbers of people feel that the fundamental contracts of capitalism and democracy have been broken. In a capitalist economy, citizens tolerate rich people if they, too, share in the wealth, and in a democracy, they give their consent to be governed if those governing do so in their interest. The Brexit vote was an opportunity for people to tell elites that both promises have been broken. The most effective line of the Leave campaign was “take back control.” It is also Donald Trump’s line.
In an era fixated on science, technology, and data, the humanities are in decline. Yet they’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities is rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can seem almost antithetical at times to the pace of modern life, and as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners David Benioff and D.B. Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.