Europe's arbitrary post-colonial borders left Africans bunched into countries that don't represent their heritage, a contradiction that still troubles them today.
South Sudanese officials look at the newly unveiled map of Sudan after separation. (Reuters)
When the nations of Nigeria and Cameroon went to settle a border dispute in 2002, in which both countries claimed an oil-rich peninsula about the size of El Paso, they didn't cite ancient cultural claims to the land, nor the preferences of its inhabitants, nor even their own national interests. Rather, in taking their case to the International Court of Justice, they cited a pile of century-old European paperwork.
Cameroon was once a German colony and Nigeria had been ruled by the British empire; in 1913, the two European powers had negotiated the border between these West African colonies. Cameroon argued that this agreement placed the peninsula within its borders; Nigeria argued that it did the same for Nigeria. Cameroon's yellowed maps were apparently more persuasive; it won the case, and will officially absorb the Bakassi Peninsula into its borders next month.
The case, as Reuters once explained, "again highlighted Africa's commitment to colonial borders drawn without consideration for those actually living there." African borders, in this thinking, are whatever Europeans happened to have marked down during the 19th and 20th centuries, which is a surprising way to do things given how little these outsider-drawn borders have to do with actual Africans.
In much of the world, national borders have shifted over time to reflect ethnic, linguistic, and sometimes religious divisions. Spain's borders generally enclose the Spanish-speakers of Europe; Slovenia and Croatia roughly encompass ethnic Slovenes and Croats. Thailand is exactly what its name suggests. Africa is different, its nations largely defined not by the heritage of their peoples but by the follies of European colonialism. But as the continent becomes more democratic and Africans assert desires for national self-determination, the African insistence on maintaining colonial-era borders is facing more popular challenges, further exposing the contradiction engineered into African society half a century ago.
When European colonialism collapsed in the years after World War Two and Africans resumed control of their own continent, sub-Saharan leaders agreed to respect the colonial borders. Not because those borders made any sense -- they are widely considered the arbitrary creations of colonial happenstance and European agreements -- but because "new rulers in Africa made the decision to keep the borders drawn by former colonizers to avoid disruptive conflict amongst themselves," as a Harvard paper on these "artificial states" put it.
Conflict has decreased in Africa since the turbulent 1960s and '70s, and though the continent still has some deeply troubled hotspots, the broader trend in Africa is one of peace, democracy, and growth. The threats of destabilizing war, of coups and counter-coups, have eased since the first independent African leaders pledged to uphold European-drawn borders. But a contradiction remains in the African system: leaders are committed to maintaining consistent borders, and yet as their governments become more democratic, they must confront the fact that popular will may conflict with those inherited borders.
A Kenyan group called the Mombasa Republican Council is just the latest of Africa's now 20-plus separatist movements, according to the Guardian, which has charted them all in an interactive map. The Mombasa group wants the country's coastal region to secede, citing a distinct heritage shaped by centuries of trade across the Indian Ocean. Secession is unlikely, but as the Guardian notes, the group is part of a trend of emboldened separatist movements, as Africans appear increasingly willing to pursue borders that more closely reflect the continent's diverse ethnic, religious, and linguistic lines.
Consider Angola. In 1575, 100 Portuguese families and 400 Portuguese troops landed on the African continent's southwestern coast at what is now the city of Luanda. They expanded from there, stopping only when they reached German, Belgian, or British claims. The Portuguese consolidated the vast, California-sized holdings into a single colony. The only thing the people who lived there had in common was that they answered to Portuguese masters, and that, beginning in 1961, they rebelled against that rule, which they threw off in 1975. They became the country of Angola, an essentially invented nation meant to represent disparate and ancient cultures as if they had simply materialized out of thin air at that very moment. Today, as some Angolans are quick to point out, their country is composed of ten major ethnic groups, which do not necessarily have a history of, or an interest in, shared nationhood. This may help explain why there are two secessionist groups in Angola today.
Had pre-industrial-era Portuguese colonists not pressed so far up along Africa's western coast so quickly, for example, then Africa's seven million Kikongo-speakers might today have their own country. Instead, they are split as minorities among three different countries, including Angola. The Bundu dia Kongo separatist group, which operates across the region, wants to establish a country that would more closely resemble the old, pre-colonial Kongo Kingdom and give the Kikongo-speakers a state of their own.
There's no reason to think that Bundu dia Kongo or the Mombasa Republican Council has any chance of establishing a sovereign state; their movements are too weak and the states they challenge too strong. But, as the 2011 division of Sudan into two countries demonstrated, the world can sometimes find flexibility in the unofficial rule about maintaining colonial African borders. Sudan was an extreme example, an infamously poorly demarcated state that encompassed some of the widest ethnic and religious gulfs in the world, but as G. Pascal Zachary wrote in TheAtlantic.com at the time, it provided an opportunity to question whether those arbitrary borders hold Africa back. After all, in countries such as Nigeria or the Democratic Republic of Congo, disparate cultural groups have been forced together, competing with one another for finite power and resources, sometimes disastrously. With tribal identities strong and national identities weak (after all, the former tend to be ancient and deeply rooted, the latter new and artificial), national cooperation can be tough.
Of course, the actual practice of secession and division would be difficult, if it's even functionally possible; Africa's ethnic groups are many, and they don't tend to fall along clean lines. The debate over whether secession is good for Africa, as Zachary explained, is a complicated and sometimes contentious one. But the simple fact of this debate is a reminder of Africa's unique post-colonial borders, a devil's bargain that sacrificed the fundamental democratic principle of national self-determination for the practical pursuits of peace and independence. And it's another indication of the many ways that colonialism's complicated legacy is still with us, still shaping today's world.
John Dowd is the president’s second personal lawyer to leave the job and it’s the second major change to his legal team this week.
Updated on March 22 at 1:15 p.m.
John Dowd announced he will depart his position as President Trump’s lead personal lawyer in the Russia investigation, the second person to leave that job in less than a year.
Dowd announced his exit late Thursday morning. The specifics of the decision remain obscure—The Washington Post described it, somewhat paradoxically, as “a largely mutual decision”—but the departure comes amid rising frustration from the president with his legal team and frustration from the legal team over Trump’s refusal to follow advice. Dowd had been a particularly strong voice arguing against Trump testifying to Special Counsel Robert Mueller, and over the weekend Dowd called for Mueller’s firing, initially telling The Daily Beast he spoke for the president, then later insisting he spoke only for himself.
The Cambridge Analytica scandal is drawing attention to malicious data thieves and brokers. But every Facebook app—even the dumb, innocent ones—collected users’ personal data without even trying.
For a spell during 2010 and 2011, I was a virtual rancher of clickable cattle on Facebook.
It feels like a long time ago. Obama was serving his first term as president. Google+ hadn’t arrived, let alone vanished again. Steve Jobs was still alive, as was Kim Jong Il. Facebook’s IPO hadn’t yet taken place, and its service was still fun to use—although it was littered with requests and demands from social games, like FarmVille and Pet Society.
I’d had enough of it—the click-farming games, for one, but also Facebook itself. Already in 2010, it felt like a malicious attention market where people treated friends as latent resources to be optimized. Compulsion rather than choice devoured people’s time. Apps like FarmVille sold relief for the artificial inconveniences they themselves had imposed.
How sugar daddies and vaginal microbes created the world’s largest HIV epidemic
VULINDLELA, South Africa—Mbali N. was just 17 when a well-dressed man in his 30s spotted her. She was at a mall in a nearby town, alone, when he called out. He might have been captivated by her almond eyes and soaring cheekbones. Or he might have just seen her for what she was: young and poor.
She tried to ignore him, she told me, but he followed her. They exchanged numbers. By the time she got home, he had called her. He said he wasn’t married, and she doesn’t know if that was true. They met at a house in a different township; she doesn’t know if it belonged to him. Mbali, who is now 24, also doesn’t know if he had HIV.
She enjoyed spending time with the man during the day, when they would talk and go to the movies. But she didn’t like it when he called at night and demanded to have sex, which happened about six times a month. When she refused him, he beat her. For her trouble, he gave her a cellphone, sweets, and chocolates.
Party leadership is sending an unmistakable signal to voters: So long as Republicans hold the congressional majority, they will not act to meaningfully constrain, or even oversee, the president.
Every time Donald Trump breaks a window, congressional Republicans obediently sweep up the glass.
That’s become one of the most predictable patterns of his turbulent presidency—and a defining dynamic of the approaching midterm elections. Each time they overtly defend his behavior, or implicitly excuse him by failing to object, they bind themselves to him more tightly.
It happened again last weekend when Trump fired off a volley of tweets that, for the first time, attacked Special Counsel Robert Mueller by name. A handful of GOP senators responded with warnings against dismissing Mueller. More congressional Republicans said nothing. Party leaders, such as House Speaker Paul Ryan, tried to downplay the attacks by insisting that Trump would not act on them and fire Mueller, who is investigating Russian interference in the 2016 presidential election. Most important, and regardless of their rhetorical posture, Republicans almost universally locked arms to reject legislative action to protect the special counsel.
FBI employees are required to adhere to an ethical standard that includes an affirmative duty to offer relevant information to internal investigators.
When Attorney General Jeff Sessions fired former FBI Deputy Director Andrew McCabe last week, just hours short of McCabe’s retirement, he cited an internal FBI investigation that concluded McCabe “lacked candor” in his conversations with investigators when asked about disclosures to the media during the 2016 election.
But what does that actually mean?
“Lack of candor is untruthfulness or an attempt to dissemble from the point of view of the investigator,” said Dave Gomez, a former FBI agent and a senior fellow at George Washington University’s Center for Cyber and Homeland Security. “The problem comes when, in answering a question, the person under investigation attempts to spin his answer in order to present his actions in the best possible light. This is normal human behavior, but can be interpreted as a lack of candor by the investigator.”
Schools are moving toward a model of continuous, lifelong learning in order to meet the needs of today’s economy.
When the giant Indian technology-services firm Infosys announced last November that it would open a design and innovation hub in Providence, the company’s president said one of the key reasons he chose Rhode Island was its strong network of higher-education institutions: Brown University, the Rhode Island School of Design, and the Community College of Rhode Island.
In a higher-education system that is often divided between two- and four-year colleges and further segregated between elite and nonelite institutions, it’s not often that a community college is mentioned in the same breath as an Ivy League campus. Nor is a two-year college seen as a training ground for jobs in the so-called creative economy, which include industries such as design, fashion, and computer gaming that typically require bachelor’s degrees.
They’re both blamed for predisposing their members to violent acts, but they’ve sparked radically different public-policy responses.
When I thought about locking up with a crew in 1996, I wanted to see a full initiation first, not just the pieces I had stumbled upon over the years. My friend Cliff and I arrived at a park not far from my home in Jamaica, Queens. Leaves danced with the wind around our feet, wafting an eerie feeling into my 14-year-old black body. The grounds of the initiation beckoned: a high-rise chain-link fence enclosing two basketball courts.
Through the daylighted chain, I watched scowls and punches and stomps engulf the uninitiated teen—a stoppage, then an awkward transition into hugs, handshakes, and smiles. The striking contrast shot at my core of authenticity, the insincerity of the punch-hug, of the stomp-smile, murdering my thoughts of joining a crew.
“I thought we would at least know what was going to happen to us.”
At first, Elena Remigi thought getting British citizenship would be a formality. Though she was born in Milan, she had lived in the United Kingdom for more than a decade. She owned a house, she had a car, and she even got permanent residency—an arduous process that involves filling out an 85-page application and providing a stack of documents to prove eligibility. But after Britons voted in June 2016 for the U.K. to leave the European Union, she thought the long and expensive process to get a British passport would be worth it.
It was so easy before. In much the same way an American from, say, Nebraska, could pick up and move to New York without having to go through an immigration process, let alone change citizenship, one point of the European Union was to give all European citizens the same kinds of rights to live and work anywhere in Europe. Moving from Milan to London was a lot like moving from Omaha to Ithaca. Except it’s not anymore—but nobody’s exactly sure yet what it’s supposed to be like.
Can TV comedies about murder move beyond the punchline?
The first murder in Barry happens offscreen. Barry (Bill Hader) walks robotically out of a hotel bathroom toward a bed where—the camera pans to reveal—a man has been shot in the head. Barry removes the silencer from a revolver and grimaces, slightly, as if he has indigestion. He pats himself down to check he hasn’t forgotten anything, looks at his watch, and leaves the room.
The premise of Bill Hader and Alec Berg’s new eight-part HBO comedy is that Barry is an assassin, but a reluctant one. Essentially gentle and conflict-averse deep down, he’s eager to hang up his weapons and try something new. As he explains to an acting coach, Gene Cousineau (Henry Winkler), he’s a former Marine who came back from Afghanistan with crippling depression and no direction, until a friend of his dad’s back in the Midwest pointed out the one job his particular set of skills made him suitable for. But, Barry says, forlornly, “I know there’s more to me than that.”
How evangelicals, once culturally confident, became an anxious minority seeking political protection from the least traditionally religious president in living memory
One of the most extraordinary things about our current politics—really, one of the most extraordinary developments of recent political history—is the loyal adherence of religious conservatives to Donald Trump. The president won four-fifths of the votes of white evangelical Christians. This was a higher level of support than either Ronald Reagan or George W. Bush, an outspoken evangelical himself, ever received.
Trump’s background and beliefs could hardly be more incompatible with traditional Christian models of life and leadership. Trump’s past political stances (he once supported the right to partial-birth abortion), his character (he has bragged about sexually assaulting women), and even his language (he introduced the words pussy and shithole into presidential discourse) would more naturally lead religious conservatives toward exorcism than alliance. This is a man who has cruelly publicized his infidelities, made disturbing sexual comments about his elder daughter, and boasted about the size of his penis on the debate stage. His lawyer reportedly arranged a $130,000 payment to a porn star to dissuade her from disclosing an alleged affair. Yet religious conservatives who once blanched at PG-13 public standards now yawn at such NC-17 maneuvers. We are a long way from The Book of Virtues.