The movie 2012 opens Friday, predicated on the notion that on December 21, 2012, as the most recent Mayan Long Count calendar cycle (5,125.366 years) comes to an end, along with a unique planetary/solar alignment and a high level of solar activity, the world will cataclysmically end one era and enter another, with earthquakes, tsunamis, floods, and all sorts of devastating destruction in the process.
The movie is not the only source of prophetic notions of doom, or at least cataclysmic change, that are gaining increasing play and attention as 2012 approaches. There are books, websites, and even several other movies scheduled for release on the subject with all kinds of angles, from secular and New Age to religious and indigenous folk legend.
Now, even if it were true that the Mayans had predicted some apocalyptic ending of the world at the end of their long calendar cycle (they had several calendars and ways of marking time; that was just one of them), it's a bit odd that we'd grab onto that one particular prophecy and belief system of theirs. After all, the Mayans also believed in human sacrifice, and we don't exactly leap on board that train in attempting to maintain civic and theological order.
But according to Sandra Noble, executive director of the Foundation for the Advancement of Mesoamerican Studies, Inc. (FAMSI), the truth is, the Mayans didn't have any apocalyptic predictions for 2012. "There is NOTHING in ancient Maya records that predicts the end of the world; no apocalypse, no destruction, no cosmic clashes. Nothing," she says.
But, wait. What about Quetzalcoatl returning and all that? Big sigh from the folks at FAMSI. In a fascinating paper available from the FAMSI website, Dr. Mark Van Stone, who has studied the Mayan culture for over a decade (and can read and write Mayan hieroglyphs), provides an illuminating and entertaining cataloging of why all the doomsayers are off the mark, and includes some great photo exhibits regarding the astronomical events scheduled for 12/21/12. Here's a sample, from his "9 Reasons Why The Mayan Prophecies Should Be Read Very Critically":
Though Aztec, Mixtec, and Maya sources provide us a number of narratives, different versions disagree. For example: the Aztec predict that this Creation will end on a 4-Movement day in a 2-Reed year, if it ends at all. The next possible Aztec end-date will be in 2027. Maya literature does not explicitly predict any end at all, and their so-called "end date" in 2012 is a 4-Ajaw [4-Flower in Aztec cycle] not 4-Movement. Mixtec Creation stories mention 2-Deer in 13-Rabbit, and other dates.
So perhaps on the 2-Deer day in the 13-Rabbit year, under a 4-Flower Moon, we might have cause to worry -- except that it seems the Mayans never corrected written mistakes (the original, and literal, "carved in stone" approach). And the Aztec official responsible for a lot of how that culture's history was written apparently had a bit of a Machiavellian propaganda minister's streak in him. Which is to say, even what they did say should be taken with a handful of archeological salt.
December 21, 2012 is still a significant day for the Mayans. It's the equivalent of our Gregorian Calendar's December 31, 1999; the turning over of a new millennium and era of timekeeping. So it would be a big celebration. But that's about it. Of course, there were also a slew of predictions about disaster and doom surrounding our own "end of a cycle" mark at the end of 1999. None of which came true, as you may recall.
So why are we so drawn to these apocalyptic notions and prophecies of doom, gloom, and destruction (even if it eventually leads to a shining new era for the select few who are chosen or manage to survive)?
The answer apparently dates back to the very earliest days of human existence. "Apocalypticism," as it is academically known, arises from a deep evolutionary sense or need for social justice, according to Allen Kerkeslager, an associate professor in Religions of the Ancient World at St. Joseph's University in Philadelphia.
"The sense of social justice, fairness in dealing with each other, and a felt need to cooperate with each other was already in place long before our hominid ancestors reached the cognitive ability to reflect on it," Professor Kerkeslager says.
As long as humans lived in the relatively egalitarian hunting and gathering societies that dominated up until about 10,000 years ago, that need was sufficiently met and enforced, because the survival of the group depended on cooperation. But when humans moved into more agrarian societies with land ownership, where a more hierarchical structure evolved, disparities increased. So those who had less had to come up with a way to explain the differences and satisfy their need for an eventual leveling of the scales. Apocalypticism, according to Kerkeslager, fulfilled that need and gave people a way of still believing that the gods were good and fair, even in an unfair world.
"Typically," he explains, "[apocalypticism] involves claims to prophetic authority among the leaders of the movement, an emphasis on visions and other forms of direct experience with the gods, and prophecies of a future transformation of the world that will bring relief to the afflicted members of the apocalyptic group and destruction on their enemies."
Not surprisingly, the phenomenon typically springs up among groups who find themselves in the minority, threatened, or repressed unfairly--at least, in their own view of the world. The Christian Book of Revelation came about under perceived Roman repression of the fledgling faith. The Anabaptists of the 1500s came out of a society stressed by economic disparity between rich and poor. Native American cultures developed apocalyptic narratives in the 1880s and 1890s, when those cultures were in danger of annihilation.
Visions and prophecies have been found in writings dating as far back as 2,000 B.C., according to Kerkeslager, although not all cultures had an equal need for thunder and lightning delivery of justice. In a polytheistic culture like ancient Greece, the need for apocalyptic beliefs was less, because a multitude of warring gods could explain misfortune or disparity. You might simply be the casualty of a power struggle between Hera and Zeus.
But as cultures became monotheistic, the disconnect between a supposedly fair and just God, and an unjust world, became harder to explain away. Hence, Kerkeslager says, apocalyptic notions in the Hebrew Book of Daniel, which was written only three years after a Greek king named Antiochus had begun a brutal repression of the Jews in Jerusalem, including turning the Jewish Temple into a shrine for Zeus. The revolt of Jewish revolutionaries, including the restoration of the temple in 165 B.C. (the same year that the Book of Daniel was written), is the basis for the Jewish holiday of Hanukkah. But at the end of the Book of Daniel, the author predicts that an apocalyptic end will come to the repressive Greeks 1,290 days after their desecration of the temple. Unfortunately, as with other apocalyptic prophecies, it didn't happen. So the last line of Daniel changes the date to 1,335 days.
The fact that that date, too, came and went didn't seem to fluster believers, any more than the earth's failure to end on January 1, 2000 has stopped people from believing that it might still happen in 2012.
"The stubborn and often surprising ability of apocalyptic groups to ignore or explain away the failures of their prophecies is one of the most well-known features of apocalyptic groups," Kerkeslager says--a phenomenon also known as "motivated reasoning," as I discussed in an earlier piece here.
So with all that knowledge and understanding, can we all breathe easy? Not quite. "The belief in an apocalyptic doomsday is still alive even in the most skeptical societies," Kerkeslager says, "because it is very much a real possibility ... The earth is about 4.5 billion years old, and has sometimes been characterized by global transformations that have indeed had an apocalyptic scope." Some of those events were natural disasters that caused mass extinctions. But many civilizations, he points out, have brought about their own extinction "by practices that exhausted their natural resources and gradually undermined their ability to sustain their own populations." Including, ironically enough, the ancient Mayans.
So perhaps the Mayans did leave us a prophecy or warning worth heeding. Just not the one everyone's talking about. But in director Roland Emmerich's defense, I have to admit that it would be a lot harder to make a blockbuster action-adventure-thriller out of recycling your grocery bags and developing renewable energy sources than something that results in an aircraft carrier on a tidal wave wiping out the White House. Which is something spinners of apocalyptic tales figured out long before there were aircraft carriers, movies, or really cool special effects.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn't point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
University leaders and observers discuss the intersection of student protests, free speech and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
People in Great Britain felt their leaders weren’t treating them fairly. Politicians in the U.S. should take note.
Britain’s Brexit vote has shocked the political elites of both the U.S. and Europe. The vote wasn’t just about the EU; in fact, polls before the referendum consistently showed that Europe wasn’t at the top of voters’ lists of concerns. But on both sides of the Atlantic Ocean, large numbers of people feel that the fundamental contracts of capitalism and democracy have been broken. In a capitalist economy, citizens tolerate rich people if they share in the wealth, and in a democracy, they give their consent to be governed if those governing do so in their interest. The Brexit vote was an opportunity for people to tell elites that both promises have been broken. The most effective line of the Leave campaign was “take back control.” It is also Donald Trump’s line.
In an era fixated on science, technology, and data, the humanities are in decline. They’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities are rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can at times seem antithetical to the pace of modern life, and even as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.