Europe's arbitrary post-colonial borders left Africans bunched into countries that don't represent their heritage, a contradiction that still troubles them today.
South Sudanese officials look at the newly unveiled map of Sudan after separation. (Reuters)
When the nations of Nigeria and Cameroon went to settle a border dispute in 2002, in which both countries claimed an oil-rich peninsula about the size of El Paso, they didn't cite ancient cultural claims to the land, nor the preferences of its inhabitants, nor even their own national interests. Rather, in taking their case to the International Court of Justice, they cited a pile of century-old European paperwork.
Cameroon was once a German colony and Nigeria had been ruled by the British empire; in 1913, the two European powers had negotiated the border between these West African colonies. Cameroon argued that this agreement put the peninsula within its borders; Nigeria countered with colonial-era claims of its own. Cameroon's yellowed maps were apparently more persuasive; it won the case, and will officially absorb the Bakassi Peninsula into its borders next month.
The case, as Reuters once explained, "again highlighted Africa's commitment to colonial borders drawn without consideration for those actually living there." African borders, in this thinking, are whatever Europeans happened to have marked down during the 19th and 20th centuries, which is a surprising way to do things given how little these outsider-drawn borders have to do with actual Africans.
In much of the world, national borders have shifted over time to reflect ethnic, linguistic, and sometimes religious divisions. Spain's borders generally enclose the Spanish-speakers of Europe; Slovenia and Croatia roughly encompass ethnic Slovenes and Croats. Thailand is exactly what its name suggests. Africa is different, its nations largely defined not by its peoples' heritage but by the follies of European colonialism. But as the continent becomes more democratic and Africans assert desires for national self-determination, the African insistence on maintaining colonial-era borders is facing more popular challenges, further exposing the contradiction engineered into African society half a century ago.
When European colonialism collapsed in the years after World War Two and Africans resumed control of their own continent, sub-Saharan leaders agreed to respect the colonial borders. Not because those borders made any sense -- they are widely considered the arbitrary creations of colonial happenstance and European agreements -- but because "new rulers in Africa made the decision to keep the borders drawn by former colonizers to avoid disruptive conflict amongst themselves," as a Harvard paper on these "artificial states" put it.
Conflict has decreased in Africa since the turbulent 1960s and '70s, and though the continent still has some deeply troubled hotspots, the broader trend in Africa is one of peace, democracy, and growth. The threats of destabilizing war, of coups and counter-coups, have eased since the first independent African leaders pledged to uphold European-drawn borders. But a contradiction remains in the African system: leaders are committed to maintaining consistent borders, and yet as those governments become more democratic, they have to confront the fact that popular will might conflict with those inherited borders.
A Kenyan group called the Mombasa Republican Council is just the latest of Africa's now 20-plus separatist movements, according to the Guardian, which has charted them all in an interactive map. The Mombasa group wants the country's coastal region to secede, citing its distinct heritage due to centuries of trade across the Indian Ocean. It's unlikely to happen, but as the Guardian notes, it's part of a trend of "encouraged" separatist movements, as Africans seem increasingly willing to pursue borders that more closely reflect the continent's diverse ethnic, religious, and linguistic lines.
Consider Angola. In 1575, 100 Portuguese families and 400 Portuguese troops landed on the African continent's southwestern coast at what is now the city of Luanda. They expanded from there, stopping only when they reached German, Belgian, or British claims. The Portuguese consolidated their vast, California-sized holdings into a single colony. The only thing the people who lived there had in common was that they answered to Portuguese masters; in 1961, they rebelled against that rule, and in 1975 they finally threw it off. They became the country of Angola, an essentially invented nation meant to represent disparate and ancient cultures as if they had simply materialized out of thin air at that very moment. Today, as some Angolans are quick to point out, their country is composed of ten major ethnic groups, who do not necessarily have a history of, or an interest in, shared nationhood. This may help explain why there are two secessionist groups in Angola today.
Had pre-industrial-era Portuguese colonists not pressed so far up along Africa's western coast so quickly, then Africa's seven million Kikongo-speakers might today have their own country. Instead, they are split as minorities among three different countries, including Angola. The Bundu dia Kongo separatist group, which operates across the region, wants to establish a country that would more closely resemble the old, pre-colonial Kongo Kingdom and give the Kikongo-speakers a state of their own.
There's no reason to think that Bundu dia Kongo or the Mombasa Republican Council have any chance at establishing sovereign states; their movements are too weak and the states they challenge are too strong. But, as the 2011 division of Sudan into two countries demonstrated, the world can sometimes find some flexibility in the unofficial rule about maintaining colonial African borders. Sudan was an extreme example, an infamously poorly demarcated state that encompassed some of the widest ethnic and religious gulfs in the world, but as G. Pascal Zachary wrote on TheAtlantic.com at the time, it provided an opportunity to question whether those arbitrary borders hold Africa back. After all, in countries such as Nigeria or the Democratic Republic of Congo, disparate cultural groups have tended to band together, competing with one another for finite power and resources, sometimes disastrously. With tribal identities strong and national identities weak (after all, the former tend to be ancient and deeply rooted, the latter new and artificial), national cooperation can be tough.
Of course, the actual practice of secession and division would be difficult, if it's even functionally possible; Africa's ethnic groups are many, and they don't tend to fall along the cleanest possible lines. The debate over whether or not secession is good for Africa, as Zachary explained, is a complicated and sometimes contentious one. But the simple fact of this debate is a reminder of Africa's unique post-colonial borders, a devil's bargain sacrificing the democratic principle of national self-determination for the practical pursuits of peace and independence. And it's another indication of the many ways that colonialism's complicated legacy is still with us, still shaping today's world.
Some fans are complaining that Zack Snyder’s envisioning of the Man of Steel is too grim—but it’s less a departure than a return to the superhero’s roots.
Since the official teaser trailer for Batman v Superman: Dawn of Justice debuted online in April, fans and critics alike have been discussing the kind of Superman Zack Snyder is going to depict in his Man of Steel sequel. The controversy stems from Snyder’s decision to portray Superman as a brooding, Dark Knight-like character who cares more about beating up bad guys than saving people. The new take has proved divisive among Superman fans: Some love this incarnation, praising it as an edgier, more realistic version of the character.
But Snyder’s is a different Superman than the one fans grew up with, and many have no problem expressing their outrage over it. Even Mark Waid, the author of Superman: Birthright (one of the comics the original film is based on), voiced his concern about Man of Steel’s turn toward bleakness when it came out in 2013:
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
19 Kids and Counting built its reputation on preaching family values, but the mass-media platforms that made the family famous might also be their undoing.
On Thursday, news broke that Josh Duggar, the oldest son of the Duggar family's 19 children, had, as a teenager, allegedly molested five underage girls. Four of them, allegedly, were his sisters.
The information came to light because, in 2006—two years before 17 Kids and Counting first aired on TLC, and thus two years before the Duggars became reality-TV celebrities—the family recorded an appearance on The Oprah Winfrey Show. Before the taping, an anonymous source sent an email to Harpo warning the production company about Josh’s alleged molestation. Harpo forwarded the email to authorities, triggering a police investigation (the Oprah appearance never aired). The news was reported this week by In Touch Weekly—after the magazine filed a Freedom of Information Act request to see the police report on the case—and then confirmed by the Duggars in a statement posted on Facebook.
In an interview, the U.S. president ties his legacy to a pact with Tehran, argues ISIS is not winning, warns Saudi Arabia not to pursue a nuclear-weapons program, and anguishes about Israel.
On Tuesday afternoon, as President Obama was bringing an occasionally contentious but often illuminating hour-long conversation about the Middle East to an end, I brought up a persistent worry. “A majority of American Jews want to support the Iran deal,” I said, “but a lot of people are anxiety-ridden about this, as am I.” Like many Jews—and also, by the way, many non-Jews—I believe that it is prudent to keep nuclear weapons out of the hands of anti-Semitic regimes. Obama, who earlier in the discussion had explicitly labeled the supreme leader of Iran, Ayatollah Ali Khamenei, an anti-Semite, responded with an argument I had not heard him make before.
“Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said, referring to the apparently almost-finished nuclear agreement between Iran and a group of world powers led by the United States. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
Advocates say that a guaranteed basic income can lead to more creative, fulfilling work. The question is how to fund it.
Scott Santens has been thinking a lot about fish lately. Specifically, he’s been reflecting on the aphorism, “If you give a man a fish, he eats for a day. If you teach a man to fish, he eats for life.” What Santens wants to know is this: “If you build a robot to fish, do all men starve, or do all men eat?”
Santens is 37 years old, and he’s a leader in the basic income movement—a worldwide network of thousands of advocates (26,000 on Reddit alone) who believe that governments should provide every citizen with a monthly stipend big enough to cover life’s basic necessities. The idea of a basic income has been around for decades, and it once drew support from leaders as different as Martin Luther King Jr. and Richard Nixon. But rather than waiting for governments to act, Santens has started crowdfunding his own basic income of $1,000 per month. He’s nearly halfway to his goal.
Why agriculture may someday take place in towers, not fields
A couple of Octobers ago, I found myself standing on a 5,000-acre cotton farm on the outskirts of Lubbock, Texas, shoulder-to-shoulder with a third-generation cotton farmer. He swept his arm across the flat, brown horizon of his field, which was at that moment being plowed by an industrial-sized picker—a toothy machine as tall as a house and operated by one man. The picker’s yields were being dropped into a giant pod to be delivered late that night to the local gin. And far beneath our feet, the Ogallala aquifer dwindled away at its frighteningly swift pace. When asked about this, the farmer spoke of reverse osmosis—the process of desalinating water—in which he seemed to put his faith, and which kept him unafraid of famine and permanent drought.
People who wear and design prosthetics are rethinking the form of our species.
When Elizabeth Wright smacks her right leg on a table, she says “ow.” That’s only interesting if you know one more thing: that her right leg is made out of carbon fiber and metal. It’s also part of her. “It is my right leg, just as my left leg is my left leg, and just as your right leg is your right leg.”
Wright was born with something called congenital limb deficiency—neither her right arm nor her right leg grew to its full length in the womb. At 2 years old, she was fitted with a prosthetic leg, something she describes as “a revelation.” Around the time she was 6 years old, the doctors decided it was time for her to try a prosthetic arm. That didn’t go as well. “This was in the 80s,” Wright says, “before the fancy hands you can use to pick up eggs and not break them. The arm that I got was purely for aesthetic reasons; it just hung there like some kind of weird dead arm, and I couldn’t do anything with it. I could actually do less. So I think it lasted two or three days and then it got relegated to the cupboard. I refused to wear it.” And it stayed there. Today, Wright still uses a prosthetic leg, one that is wholly hers, entirely a part of her identity, and she still rejects the use of a prosthetic arm. She says she’s learned how to do things without it.
No police officers will serve time for the November 2012 shooting deaths of two unarmed black civilians.
On November 29, 2012, police officers and witnesses heard what appeared to be gunshots coming from a car driving near a police station in Cleveland. A high-speed car chase ensued, drawing in over 100 officers on duty, before the police managed to corner the car. Thirteen police officers then fired 137 rounds of ammunition at the vehicle, whose occupants Cleveland police suspected were armed. After the other officers stopped firing, 31-year-old Michael Brelo climbed on top of the hood of the suspect’s car and fired 15 more rounds at close range. When the shooting stopped, the car’s occupants, 43-year-old Timothy Russell and 30-year-old Malissa Williams, were dead. Both were unarmed. The “gunshot” witnesses heard turned out to be a backfiring car.