Europe's arbitrary post-colonial borders left Africans bunched into countries that don't represent their heritage, a contradiction that still troubles them today.
South Sudanese officials look at the newly unveiled map of Sudan after separation. (Reuters)
When the nations of Nigeria and Cameroon went to settle a border dispute in 2002, in which both countries claimed an oil-rich peninsula about the size of El Paso, they didn't cite ancient cultural claims to the land, nor the preferences of its inhabitants, nor even their own national interests. Rather, in taking their case to the International Court of Justice, they cited a pile of century-old European paperwork.
Cameroon was once a German colony and Nigeria had been ruled by the British empire; in 1913, the two European powers had negotiated the border between these West African colonies. Cameroon argued that this agreement put the peninsula within its borders. Nigeria said the same. Cameroon's yellowed maps were apparently more persuasive; it won the case, and will officially absorb the Bakassi Peninsula into its borders next month.
The case, as Reuters once explained, "again highlighted Africa's commitment to colonial borders drawn without consideration for those actually living there." African borders, in this thinking, are whatever Europeans happened to have marked down during the 19th and 20th centuries, which is a surprising way to do things given how little these outsider-drawn borders have to do with actual Africans.
In much of the world, national borders have shifted over time to reflect ethnic, linguistic, and sometimes religious divisions. Spain's borders generally enclose the Spanish-speakers of Europe; Slovenia and Croatia roughly encompass ethnic Slovenes and Croats. Thailand is exactly what its name suggests. Africa is different, its nations largely defined not by its peoples' heritage but by the follies of European colonialism. But as the continent becomes more democratic and Africans assert desires for national self-determination, the African insistence on maintaining colonial-era borders is facing more popular challenges, further exposing the contradiction engineered into African society half a century ago.
When European colonialism collapsed in the years after World War Two and Africans resumed control of their own continent, sub-Saharan leaders agreed to respect the colonial borders. Not because those borders made any sense -- they are widely considered the arbitrary creations of colonial happenstance and European agreements -- but because "new rulers in Africa made the decision to keep the borders drawn by former colonizers to avoid disruptive conflict amongst themselves," as a Harvard paper on these "artificial states" put it.
Conflict has decreased in Africa since the turbulent 1960s and '70s, and though the continent still has some deeply troubled hotspots, the broader trend in Africa is one of peace, democracy, and growth. The threats of destabilizing war, of coups and counter-coups, have eased since the first independent African leaders pledged to uphold European-drawn borders. But a contradiction remains in the African system: leaders are committed to maintaining consistent borders, and yet as those governments become more democratic, they have to confront the fact that popular will might conflict with those borders.
A Kenyan group called the Mombasa Republican Council is just the latest of Africa's now 20-plus separatist movements, according to the Guardian, which has charted them all in an interactive map. The Mombasa group wants the country's coastal region to secede, citing its distinct heritage due to centuries of trade across the Indian Ocean. It's unlikely to happen, but as the Guardian notes it's part of a trend of "encouraged" separatist movements as Africans seem to become more willing and interested in pursuing borders that more closely reflect the continent's diverse ethnic, religious, and linguistic lines.
Consider Angola. In 1575, 100 Portuguese families and 400 Portuguese troops landed on the African continent's southwestern coast at what is now the city of Luanda. They expanded from there, stopping only when they reached German, Belgian, or British claims. The Portuguese consolidated the vast, California-sized holdings into a single colony. The only things the people who lived there had in common were that they answered to Portuguese masters and, beginning in 1961, that they rebelled against that rule, which they threw off in 1975. They became the country of Angola, an essentially invented nation meant to represent disparate and ancient cultures as if it had simply materialized out of thin air at that very moment. Today, as some Angolans are quick to point out, their country is composed of ten major ethnic groups, who do not necessarily have a history of or an interest in shared nationhood. This may help explain why there are two secessionist groups in Angola today.
Had pre-industrial-era Portuguese colonists not pressed so far up along Africa's western coast so quickly, for example, then Africa's seven million Kikongo-speakers might today have their own country. Instead, they are split among three different countries, including Angola, as minorities. The Bundu dia Kongo separatist group, which operates across the region, wants to establish a country that would more closely resemble the old, pre-colonial Kongo Kingdom, and give the Kikongo-speakers a country of their own.
There's no reason to think that Bundu dia Kongo or the Mombasa Republican Council have any chance at establishing sovereign states; their movements are too weak and the states they challenge are too strong. But, as the 2011 division of Sudan into two countries demonstrated, the world can sometimes find some flexibility in the unofficial rule about maintaining colonial African borders. Sudan was an extreme example, an infamously poorly demarcated state that encompassed some of the widest ethnic and religious gulfs in the world, but as G. Pascal Zachary wrote in TheAtlantic.com at the time, it provided an opportunity to question whether those arbitrary borders hold Africa back. After all, in countries such as Nigeria or the Democratic Republic of Congo, disparate cultural groups have tended to band together, competing with one another for finite power and resources, sometimes disastrously. With tribal identities strong and national identities weak (after all, the former tend to be ancient and deeply rooted, the latter new and artificial), national cooperation can be tough.
Of course, the actual practice of secession and division would be difficult, if it's even functionally possible; Africa's ethnic groups are many, and they don't tend to fall along the cleanest possible lines. The debate over whether secession is good for Africa, as Zachary explained, is a complicated and sometimes contentious one. But the simple fact of this debate is a reminder of Africa's unique post-colonial borders, a devil's bargain that sacrificed the democratic fundamental of national self-determination for the practical pursuit of peace and independence. And it's another indication of the many ways that colonialism's complicated legacy is still with us, still shaping today's world.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The 2016 Sony World Photography Awards are now taking entries, and the organizers have been kind enough to share some of their early entries with us.
The 2016 Sony World Photography Awards are now taking entries, and the organizers have been kind enough to share some of their early entries with us, gathered below. Last year’s competition attracted over 173,000 entries from 171 countries. Entries will be accepted until May 1, 2016. All captions below come from the photographers.
A recent Brookings study suggests that brains and drive have more to do with lifelong success than family wealth. But there's a big catch.
I know, I know, you'd rather be born smart and rich (and charming, and with a lustrous head of hair, and a voice like Michael Bolton's). But if you had to choose? Chances are, your answer depends on whether you think the U.S. economy is a meritocracy—that intelligence and ambition are more important to lifelong success than the circumstances of your birth.
A recent Brookings paper gives reasons for optimism. Over the long term, it finds, smart kids earn more than rich kids. But sadly, there's a big catch.
The Brookings paper looked at the relationship between brains, motivation, and economic mobility among a group of youth the government began tracking in 1979. Here's the executive summary: If they were bright and driven, poor kids stood a decent chance of becoming upper-middle-class, or better. Of low-income teens who scored in the top third of test-takers on the Armed Forces Qualification Test (on the far left in green), more than 40 percent made it to the top two income quintiles by adulthood. Meanwhile, dimwitted children of affluence generally fell down the economic ladder. Among high-income teens who scored in the bottom third of AFQT takers (on the far right in orange), more than half ended up in the bottom two income quintiles.
The Republican frontrunner has surged in the polls by taking a tough stance on immigration—and if critics want to stop him, that’s what they need to attack.
A new round of attack ads is heading Donald Trump’s way, some from John Kasich’s campaign and the super PAC backing him, and more in the future from an LLC created specifically to produce anti-Trump messages.
New Day for America’s 47-second ad splices together some of the Republican front-runner’s most awkward video moments: his suggestion he might date his daughter, his claim of “a great relationship with the blacks.” The Kasich campaign’s ad turns Martin Niemöller’s famous words “nobody left to speak for me” into a warning from one of John McCain’s fellow Hanoi Hilton POWs that a Trump presidency is a threat to freedom.* John Kasich’s Twitter account has fired direct personal challenges to the famously thin-skinned mogul.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
Nobody’s focused on winning the peace. That’s a big problem.
In August 1941, Winston Churchill and Franklin Roosevelt met off the coast of Newfoundland to outline a shared vision for the post-World War II era. The British prime minister was so thrilled to see the American president that, in the words of one official, “You’d have thought he was being carried up into the heavens to meet God.” The two countries issued the Atlantic Charter, which sought “a better future for the world” through the principles of self-determination, collective security, and free trade. The United States hadn’t even entered the war yet, but it was already focused on winning the peace. The endgame was not just the defeat of the Axis powers, but also the creation of a stable global order, in which World War II would be the last world war.
An entire industry has been built on the premise that creating gourmet meals at home is simple and effortless. But it isn’t true.
I write about food for a living. Because of this, I spend more time than the average American surrounded by cooking advice and recipes. I’m also a mother, which means more often than not, when I return from work 15 minutes before bedtime, I end up feeding my 1-year-old son squares of peanut-butter toast because there was nothing in the fridge capable of being transformed into a wholesome, homemade toddler meal in a matter of minutes. Every day, when I head to my office after a nourishing breakfast of smashed blueberries or oatmeal I found stuck to the pan, and open a glossy new cookbook, check my RSS feed, or page through a stack of magazines, I’m confronted by an impenetrable wall of unimaginable cooking projects, just sitting there pretending to be totally reasonable meals. Homemade beef barbacoa tacos. Short-rib potpie. “Weekday” French toast. Make-ahead coconut cake. They might as well be skyscraper blueprints, so improbable is the possibility that I will begin making my own nut butters, baking my own sandwich bread, or turning that fall farmer’s market bounty into jars of homemade applesauce.
CRISPR can finally tell us which human genes are essential—and which matter specifically to cancer cells.
Humans have between 20,000 and 25,000 genes, but which of these really matter? Which are essential, and which are merely optional add-ons?
It’s crazy to me that we still don't know, even though it’s been almost 15 years since the first draft of the human genome was published. Partly, the problem is a technological one. The best way of working out if a gene is essential is to disable it and see what happens, and “we just didn’t have a good way of systematically manipulating genes in human cells,” says Jason Moffat from the University of Toronto. Sure, scientists have been able to tinker with individual genes, but working through them all, and knocking them out one by one, has been nigh-on impossible.