Since colonialism brought Western and Islamic societies crashing together over a century ago, the former has struggled to understand the rage it seems to provoke in the latter.
A protester rests on a barricade near the U.S. embassy in Cairo, Egypt. (Reuters)
In August 1857, a century before the United Nations would declare the Israeli state in what had been Palestine, before British and French diplomats would formally carve up the Middle East, before the U.S. would back a coup in Iran, before political Islamism would emerge, and before the U.S. would arm unmanned airplanes to kill Islamism's most violent and radical adherents, the British empire found itself besieged by Muslim protesters.
Officers at Fort William, in the Indian city of Calcutta, were the first to require colonial troops to grease their rifles with a compound that included cow and pig fat, a mixture guaranteed to offend both Hindus and Muslims. Many of the troops, known as sepoys, protested. The protests spread and turned violent, growing into an uprising that affected much of the British Raj at a time long before it was unified by roads or telephones, much less cell phones or the Internet. To give a sense of scale, the Raj covered about 4 million square kilometers; the countries of today's European Union make up 4.3 million.
In retrospect, the cause and effect between the animal grease and the protests might seem obvious, but it shocked British overseers at the time, and historians still dispute the larger causes, which seem to go well beyond just the pig fat offense. "Muslim activists called the mutiny a jihad, and their well-organized assaults suggested that the bullet-grease issue had merely been the spark," Tamim Ansary wrote in his book, Destiny Disrupted: A History of the World Through Islamic Eyes. The suddenness and vociferousness of 1857's Muslim protests, in what was then the country with the largest Muslim population in the world, as well as the West's struggle to either foresee or understand their anger, have their echoes in this past week's demonstrations against the U.S. over the anti-Islam film Innocence of Muslims.
As the Western world once again endeavors to understand the roots of apparently anti-Western rage that have again surfaced in large parts of the Muslim world, it's worth remembering the history of offense and backlash that has been a recurring theme of their intersections. Ansary's history of the 1857 Sepoy Rebellion cited "the cultural gulf between the British officers and their [Indian] foot soldiers, a gulf that had not existed before Europeans arrived." Then, as now, Western observers looked for causes political and cultural, particular to this uprising in this moment and general to the region and its history. They've found plenty: economic disenfranchisement among certain classes, conversion anxieties, political manipulation, local factors, and of course foreign domination, among many others. Islam and its followers came under special scrutiny, also like today, although the fact that so many Hindus participated suggests that the particularities of this one religion were not a good lens for understanding the rebellion.
It's entirely possible, even likely, that there is truth to a number of these theories, just as with the sometimes similar and sometimes different theories of "Muslim Rage," to borrow from a 1990 Atlantic cover story, that Westerners have explored so many times before. We've had many opportunities to theorize: the 2010 Florida Koran burning protests, the 2005 Muhammad cartoon protests, the wide 1990 demonstrations in support of Saddam Hussein that shocked the West, the near-global violence over Salman Rushdie's 1988 novel The Satanic Verses, and the deadly 1979 U.S. embassy attacks in Iran, Libya, and Pakistan. Protests and anger marked much of the colonial era as well, from the 1936 Arab Uprising in then-Palestine to the 1857 Sepoy Rebellion to the 1879 Urabi Revolt in Egypt.
It's worth considering the extent to which these movements have been connected by themes that can both encompass and be larger than the particularities of each. Many in the Middle East and South Asia are in fact furious with the U.S. for its drone program, but their anger and suspicion look awfully similar to those propelling the demonstrations in, for example, 1979 or 1988 or 1990 or 2005, during most of which drones did not exist. To say that Muslims are protesting because they're angry about drones is true in a similar way that, for example, San Francisco Democrats are likely to vote against Mitt Romney in November because they dislike his stance on gay marriage, or that people in China are protesting Japan because they disagree with Tokyo's claim over some disputed islands.
There is probably no simple, single explanation for something as old, complicated, and variegated as the anger in parts of the Muslim world against the West. Not even colonialism, perhaps the single most significant interaction between the Western and Muslim worlds since the Renaissance, is a satisfactory explanation: why, then, do the harshly colonized societies of sub-Saharan Africa report some of the highest approval ratings for American leadership in the world? (Before you answer "because oil" or "because Islam," keep in mind that Angola and Nigeria are enormous oil exporters to the U.S., and that much of Africa is Muslim.)
Perhaps the single most consistent theme in the anti-Western protests and incidents that we so often term "Muslim rage" is our perennial struggle to understand them. "Why do they hate us?" is a question we've been asking for a long time. Judging by some of the protest signs dotting Africa and Asia last week, demanding Western respect for Islam and its adherents, it might be a question that many Muslims ask of us, too. None of this is to advance a specific theory for last week's protests or the anger behind them, but rather to place them within the much longer history of offense and outrage between the Western and Muslim worlds, a generations-old mutual misapprehension that has long defied the sorts of easy answers that we might be tempted to reach for today.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
As he prepares for a presidential run, the governor’s labor legacy deserves inspection. Are his state’s “hardworking taxpayers” any better off?
This past February, at the Conservative Political Action Conference (CPAC) outside Washington, D.C., Wisconsin Governor Scott Walker rolled up his sleeves, clipped on a lavalier microphone, and without the aid of a teleprompter gave the speech of his life. He emerged from that early GOP cattle call as a front-runner for his party’s nomination for president. Numerous polls this spring placed him several points ahead of former Florida Governor Jeb Bush, the preferred candidate of the Republican establishment, in Iowa and New Hampshire. Those same polls showed him with an even more substantial lead over movement conservative favorites such as Ted Cruz, Rand Paul, and Mike Huckabee. In late April, the Koch brothers hinted that Walker would be the likely recipient of the nearly $900 million they plan to spend on the 2016 election cycle.
The untold story of the improbable campaign that finally tipped the U.S. Supreme Court.
On May 18, 1970, Jack Baker and Michael McConnell walked into a courthouse in Minneapolis, paid $10, and applied for a marriage license. The county clerk, Gerald Nelson, refused to give it to them. Obviously, he told them, marriage was for people of the opposite sex; it was silly to think otherwise.
Baker, a law student, didn’t agree. He and McConnell, a librarian, had met at a Halloween party in Oklahoma in 1966, shortly after Baker was pushed out of the Air Force for his sexuality. From the beginning, the men were committed to one another. In 1967, Baker proposed that they move in together. McConnell replied that he wanted to get married—really, legally married. The idea struck even Baker as odd at first, but he promised to find a way and decided to go to law school to figure it out.
Many authors have been tempted into writing revisionist histories of the 37th U.S. president, but these counterintuitive takes often do not hold up under closer scrutiny.
Every once in a while someone writes a book arguing that Richard Nixon has been misunderstood. These authors tend to focus on some particular aspect of his presidency that, the argument goes, is more important than that Watergate business. They’ve focused on his domestic policy or his foreign policy as achievements that override his flaws and his presidency’s denouement. Nixon’s highly complex persona also has led to books that probe his psyche—a hazardous and widely debunked practice, though that hasn’t discouraged further attempts.
And, as with other major figures, but all the more so given the drama of his time on the national stage, Nixon’s complexity and essentially low repute tempt some authors to offer revisionist approaches to his place in history. Such approaches have to be assessed on their own merits, not accepted merely because they’re counterintuitive or receive a lot of attention, as new assessments of the controversial and fascinating Nixon tend to do. Two major revisionist books about Nixon argued that his domestic policy was so expansive, humane, and innovative that it overrides his unfortunate behavior; their accounts relegate Watergate to a far less important role. The problem with these books is that they don’t stand up to close scrutiny.
Mike Huckabee and Ted Cruz are suggesting there might be ways for states and cities to nullify the justices’ ruling. They’re wrong.
The Supreme Court’s decision last week did make gay marriage legal around the nation. Unfortunately for social conservatives, it did not make nullification legal around the nation.
Nullification is the historical idea that states can ignore federal laws, or pass laws that supersede them. This concept has a long but not especially honorable pedigree in U.S. history. Its origins date back to antebellum America, when Southern states tried to nullify tariffs and Northern states tried to nullify fugitive-slave laws. In the 1950s, after Brown v. Board of Education, some Southern states tried to pass laws to avoid integrating schools. It didn’t work, because nullification is not constitutional.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
Was the Concorde a triumph of modern engineering, a metaphor for misplaced 20th-century values, or both?
The box sat untouched in his bottom desk drawer. For weeks we discussed opening it, and one January morning he was ready. I set the box on his white bedsheets and removed the stack of passports, which could have belonged to a family with dual citizenship. But all nine—from 1956 to a valid update issued in 2014—belong to my 89-year-old grandfather.
Lying in bed, he unfolded a stamp-covered page like an accordion and held it open above his chest. “Oh my,” he kept repeating. He paused, and pointed.
London. March 22, 1976. My then-50-year-old grandfather, Raymond Pearlson, the inventor of Syncrolift, was traveling the world selling his shiplift system. Concorde had launched commercially that January. He knew exactly what this stamp represented: Washington Dulles to London Heathrow in 3.5 hours—the first of at least 150 supersonic flights he took on the legendary aircraft.
Engineers at IBM and Google claim they're closer than ever to making computers that could process data in days that would take millions of years to flow through today's machines.
One of the first electronic, programmable computers in the world is remembered today mostly by its nickname: Colossus. The fact that this moniker evokes one of the seven wonders of the ancient world is fitting both physically and conceptually. Colossus, which filled an entire room and included dinner-plate-sized pulleys that had to be loaded with tape, was built in World War II to help crack Nazi codes. Ten versions of the mammoth computer would decrypt tens of millions of characters of German messages before the war ended.
Colossus was a marvel at a time when “computers” still referred to people—women, usually—rather than machines. And it is practically unrecognizable by today's computing standards, made up of thousands of vacuum tubes that contained glowing hot filaments. The machine was programmable, but not based on stored memory. Operators used switches and plugs to modify wires when they wanted to run different programs. Colossus was a beast and a capricious one at that.
I spent a year in Tromsø, Norway, where the “Polar Night” lasts all winter—and where rates of seasonal depression are remarkably low. Here’s what I learned about happiness and the wintertime blues.
Located over 200 miles north of the Arctic Circle, Tromsø, Norway, is home to extreme light variation between seasons. During the Polar Night, which lasts from November to January, the sun doesn’t rise at all. Then the days get progressively longer until the Midnight Sun period, from May to July, when it never sets. After the midnight sun, the days get shorter and shorter again until the Polar Night, and the yearly cycle repeats.
So, perhaps understandably, many people had a hard time relating when I told them I was moving there.
“I could never live there,” was the most common response I heard. “That winter would make me so depressed,” many added, or “I just get so tired when it’s dark out.”
But the Polar Night was what drew me to Tromsø in the first place.