Europe's arbitrary post-colonial borders left Africans bunched into countries that don't represent their heritage, a contradiction that still troubles them today.
South Sudanese officials look at the newly unveiled map of Sudan after separation. (Reuters)
When the nations of Nigeria and Cameroon went to settle a border dispute in 2002, in which both countries claimed an oil-rich peninsula about the size of El Paso, they didn't cite ancient cultural claims to the land, nor the preferences of its inhabitants, nor even their own national interests. Rather, in taking their case to the International Court of Justice, they cited a pile of century-old European paperwork.
Cameroon was once a German colony and Nigeria had been ruled by the British empire; in 1913, the two European powers had negotiated the border between these West African colonies. Cameroon argued that this agreement put the peninsula within its borders. Nigeria said the same. Cameroon's yellowed maps were apparently more persuasive; it won the case, and will officially absorb the Bakassi Peninsula into its borders next month.
The case, as Reuters once explained, "again highlighted Africa's commitment to colonial borders drawn without consideration for those actually living there." African borders, in this thinking, are whatever Europeans happened to have marked down during the 19th and 20th centuries, which is a surprising way to do things given how little these outsider-drawn borders have to do with actual Africans.
In much of the world, national borders have shifted over time to reflect ethnic, linguistic, and sometimes religious divisions. Spain's borders generally enclose the Spanish-speakers of Europe; Slovenia and Croatia roughly encompass ethnic Slovenes and Croats. Thailand is exactly what its name suggests. Africa is different, its nations largely defined not by its peoples' heritage but by the follies of European colonialism. But as the continent becomes more democratic and Africans assert their desire for national self-determination, the African insistence on maintaining colonial-era borders is facing more popular challenges, further exposing the contradiction engineered into African society half a century ago.
When European colonialism collapsed in the years after World War Two and Africans resumed control of their own continent, sub-Saharan leaders agreed to respect the colonial borders. Not because those borders made any sense -- they are widely considered the arbitrary creations of colonial happenstance and European agreements -- but because "new rulers in Africa made the decision to keep the borders drawn by former colonizers to avoid disruptive conflict amongst themselves," as a Harvard paper on these "artificial states" put it.
Conflict has decreased in Africa since the turbulent 1960s and '70s, and though the continent still has some deeply troubled hotspots, the broader trend in Africa is one of peace, democracy, and growth. The threats of destabilizing war, of coups and counter-coups, have eased since the first independent African leaders pledged to uphold European-drawn borders. But a contradiction remains in the African system: leaders are committed to maintaining those consistent borders, and yet as their governments become more democratic, they have to confront the fact that popular will may conflict with the borders themselves.
A Kenyan group called the Mombasa Republican Council is just the latest of Africa's now 20-plus separatist movements, according to the Guardian, which has charted them all in an interactive map. The Mombasa group wants the country's coastal region to secede, citing its distinct heritage shaped by centuries of trade across the Indian Ocean. It's unlikely to happen, but as the Guardian notes, it's part of a trend of "encouraged" separatist movements, as Africans seem increasingly willing to pursue borders that more closely reflect the continent's diverse ethnic, religious, and linguistic lines.
Consider Angola. In 1575, 100 Portuguese families and 400 Portuguese troops landed on the African continent's southwestern coast at what is now the city of Luanda. They expanded from there, stopping only when they reached German, Belgian, or British claims. The Portuguese consolidated the vast, California-sized holdings into a single colony. The only thing the people who lived there had in common was that they answered to Portuguese masters, against whom they rebelled in 1961 and whose rule they finally threw off in 1975. They became the country of Angola, an essentially invented nation that treated disparate and ancient cultures as if they had simply materialized out of thin air at that very moment. Today, as some Angolans are quick to point out, their country is composed of ten major ethnic groups, who do not necessarily have a history of or an interest in shared nationhood. This may help explain why there are two secessionist groups in Angola today.
Had pre-industrial-era Portuguese colonists not pressed so far up along Africa's western coast so quickly, for example, then Africa's seven million Kikongo-speakers might today have their own country. Instead, they are split among three different countries, including Angola, as minorities. The Bundu dia Kongo separatist group, which operates across the region, wants to establish a country that would more closely resemble the old, pre-colonial Kongo Kingdom, and give the Kikongo-speakers a state of their own.
There's no reason to think that Bundu dia Kongo or the Mombasa Republican Council have any chance at establishing sovereign states; their movements are too weak and the states they challenge are too strong. But, as the 2011 division of Sudan into two countries demonstrated, the world can sometimes find some flexibility in the unofficial rule about maintaining colonial African borders. Sudan was an extreme example, an infamously poorly demarcated state that encompassed some of the widest ethnic and religious gulfs in the world, but as G. Pascal Zachary wrote in TheAtlantic.com at the time, it provided an opportunity to question whether those arbitrary borders hold Africa back. After all, in countries such as Nigeria or the Democratic Republic of Congo, disparate cultural groups have tended to band together, competing with one another for finite power and resources, sometimes disastrously. With tribal identities strong and national identities weak (after all, the former tend to be ancient and deeply rooted, the latter new and artificial), national cooperation can be tough.
Of course, the actual practice of secession and division would be difficult, if it's even functionally possible; Africa's ethnic groups are many, and they don't tend to fall along clean lines. The debate over whether secession is good for Africa, as Zachary explained, is a complicated and sometimes contentious one. But the simple fact of this debate is a reminder of Africa's unique post-colonial borders, a devil's bargain that sacrificed the fundamental democratic principle of national self-determination for the practical pursuits of peace and independence. And it's another indication of the many ways that colonialism's complicated legacy is still with us, still shaping today's world.
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn't point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration; the British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
Rabia Chaudry, a friend of Syed’s family who spearheaded the campaign to get him a new trial, tweeted: “I am shaking with joy, shaking! Thank you Judge Welch. Thank you.” Judge Martin Welch of the Circuit Court for Baltimore City signed Thursday’s order.
Syed was convicted in 2000 of strangling Hae Min Lee, 18, and burying her body in Baltimore’s Leakin Park. The two had dated when they attended the city’s Woodlawn High School. Syed was sentenced to life plus 30 years in prison for the killing.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
As incomes fall across the nation, even better-off areas like Sheboygan County, Wisconsin, are faltering.
SHEBOYGAN, Wisc.—There is still a sizable middle class in this county of 115,000 on the shores of Lake Michigan, a pleasant hour’s drive from Milwaukee. You can see it in the cars that pour in and out of the parking lots of local factories, in the restaurants packed with older couples on weeknights, and in the bars that seem to be on every single corner. You can see it in the local parks, including one called Field of Dreams, where kids play soccer and baseball and their parents sit and watch.
About 63 percent of adults in Sheboygan make between $41,641 and $124,924, meaning the area has one of the highest shares of middle-class households in the country, according to a report from the Pew Research Center. Nationally, only 51 percent of adults are middle-class.
President Obama signs into law a much-needed update. But will it change anything?
In 1993, Monte Finkelstein filed what would become one of the longest-running requests for government records in history. Eager to complete his book on the relationship between the Allies and the Sicilian mafia during World War II, the community-college professor submitted a Freedom of Information Act request with the National Archives.
Five years later and still waiting for a response, Finkelstein published his book without the documents. “To be honest, I gave up,” he said. “I just surrendered. From my point of view, it was a bureaucratic nightmare.”
On Thursday—a few days before July 4, the Freedom of Information Act’s 50th birthday—President Barack Obama signed one of the most hard-fought FOIA-reform bills in decades. In doing so, he made permanent the presumption that all government records are public unless proven otherwise, a central tenet of his administration’s efforts, and one that had been vulnerable to revocation by, say, a President Trump. This alone is cause for high-fives among open-government groups, which have convinced many states to enshrine that common-sense provision in their laws. Until now, the United States Congress never had.