Europe's arbitrary post-colonial borders left Africans bunched into countries that don't represent their heritage, a contradiction that still troubles them today.
South Sudanese officials look at the newly unveiled map of Sudan after separation. (Reuters)
When the nations of Nigeria and Cameroon went to settle a border dispute in 2002, in which both countries claimed an oil-rich peninsula about the size of El Paso, they didn't cite ancient cultural claims to the land, nor the preferences of its inhabitants, nor even their own national interests. Rather, in taking their case to the International Court of Justice, they cited a pile of century-old European paperwork.
Cameroon was once a German colony and Nigeria had been ruled by the British empire; in 1913, the two European powers had negotiated the border between these West African colonies. Cameroon argued that this agreement put the peninsula within its borders; Nigeria read the same documents as supporting its own claim. Cameroon's yellowed maps were apparently more persuasive; it won the case, and will officially absorb the Bakassi Peninsula into its borders next month.
The case, as Reuters once explained, "again highlighted Africa's commitment to colonial borders drawn without consideration for those actually living there." African borders, in this thinking, are whatever Europeans happened to have marked down during the 19th and 20th centuries, which is a surprising way to do things given how little these outsider-drawn borders have to do with actual Africans.
In much of the world, national borders have shifted over time to reflect ethnic, linguistic, and sometimes religious divisions. Spain's borders generally enclose the Spanish-speakers of Europe; Slovenia and Croatia roughly encompass ethnic Slovenes and Croats. Thailand is exactly what its name suggests. Africa is different, its nations largely defined not by its peoples' heritage but by the follies of European colonialism. But as the continent becomes more democratic and Africans assert their desire for national self-determination, the African insistence on maintaining colonial-era borders is facing more popular challenges, further exposing the contradiction engineered into African society half a century ago.
When European colonialism collapsed in the years after World War Two and Africans resumed control of their own continent, sub-Saharan leaders agreed to respect the colonial borders. Not because those borders made any sense -- they are widely considered the arbitrary creations of colonial happenstance and European agreements -- but because "new rulers in Africa made the decision to keep the borders drawn by former colonizers to avoid disruptive conflict amongst themselves," as a Harvard paper on these "artificial states" put it.
Conflict has decreased in Africa since the turbulent 1960s and '70s, and though the continent still has some deeply troubled hotspots, the broader trend in Africa is one of peace, democracy, and growth. The threats of destabilizing war, of coups and counter-coups, have eased since the first independent African leaders pledged to uphold European-drawn borders. But a contradiction remains in the African system: leaders are committed to maintaining consistent borders, and yet as those governments become more democratic, they have to confront the fact that popular will might conflict with those inherited borders.
A Kenyan group called the Mombasa Republican Council is just the latest of Africa's now 20-plus separatist movements, according to the Guardian, which has charted them all in an interactive map. The Mombasa group wants the country's coastal region to secede, citing its distinct heritage due to centuries of trade across the Indian Ocean. It's unlikely to happen, but as the Guardian notes, it's part of a trend of "encouraged" separatist movements, as Africans seem to become more willing to pursue borders that more closely reflect the continent's diverse ethnic, religious, and linguistic lines.
Consider Angola. In 1575, 100 Portuguese families and 400 Portuguese troops landed on the African continent's southwestern coast at what is now the city of Luanda. They expanded from there, stopping only when they reached German, Belgian, or British claims. The Portuguese consolidated the vast, California-sized holdings into a single colony. The only thing the people who lived there had in common was that they answered to Portuguese masters, and, beginning in 1961, that they rebelled against that rule, finally throwing it off in 1975. They became the country of Angola, an essentially invented nation meant to represent disparate and ancient cultures as if they had simply materialized out of thin air at that very moment. Today, as some Angolans are quick to point out, their country is composed of ten major ethnic groups, who do not necessarily have a history of or an interest in shared nationhood. That may help explain why there are two secessionist groups in Angola today.
Had pre-industrial-era Portuguese colonists not pressed so far up along Africa's western coast so quickly, for example, then Africa's seven million Kikongo-speakers might today have their own country. Instead, they are split as minorities among three different countries, including Angola. The Bundu dia Kongo separatist group, which operates across the region, wants to establish a country that would more closely resemble the old, pre-colonial Kongo Kingdom and give the Kikongo-speakers a state of their own.
There's no reason to think that Bundu dia Kongo or the Mombasa Republican Council have any chance at establishing sovereign states; their movements are too weak and the states they challenge are too strong. But, as the 2011 division of Sudan into two countries demonstrated, the world can sometimes find some flexibility in the unofficial rule about maintaining colonial African borders. Sudan was an extreme example, an infamously poorly demarcated state that encompassed some of the widest ethnic and religious gulfs in the world, but as G. Pascal Zachary wrote in TheAtlantic.com at the time, it provided an opportunity to question whether those arbitrary borders hold Africa back. After all, in countries such as Nigeria or the Democratic Republic of Congo, disparate cultural groups have tended to band together, competing with one another for finite power and resources, sometimes disastrously. With tribal identities strong and national identities weak (after all, the former tend to be ancient and deeply rooted, the latter new and artificial), national cooperation can be tough.
Of course, the actual practice of secession and division would be difficult, if it's even functionally possible; Africa's ethnic groups are many, and they don't tend to fall along the cleanest possible lines. The debate over whether or not secession is good for Africa, as Zachary explained, is a complicated and sometimes contentious one. But the simple fact of this debate is a reminder of Africa's unique post-colonial borders, a devil's bargain sacrificing the democratic fundamental of national self-determination for the practical pursuits of peace and independence. And it's another indication of the many ways that colonialism's complicated legacy is still with us, still shaping today's world.
Long after research contradicts common medical practices, patients continue to demand them and physicians continue to deliver. The result is an epidemic of unnecessary and unhelpful treatments.
First, listen to the story with the happy ending: At 61, the executive was in excellent health. His blood pressure was a bit high, but everything else looked good, and he exercised regularly. Then he had a scare. He went for a brisk post-lunch walk on a cool winter day, and his chest began to hurt. Back inside his office, he sat down, and the pain disappeared as quickly as it had come.
That night, he thought more about it: middle-aged man, high blood pressure, stressful job, chest discomfort. The next day, he went to a local emergency department. Doctors determined that the man had not suffered a heart attack and that the electrical activity of his heart was completely normal. All signs suggested that the executive had stable angina—chest pain that occurs when the heart muscle is getting less blood-borne oxygen than it needs, often because an artery is partially blocked.
Two historians weigh in on how to understand the new administration, press relations, and this moment in political time.
The election of Donald Trump, and the early days of his presidency, have driven many Americans to rummage through history in search of context and understanding. Trump himself has been compared to historical figures ranging from Ronald Reagan to Henry Ford, and from Andrew Jackson to Benito Mussolini. His steps have been condemned as unprecedented by his critics, and praised as historic by his supporters.
To place contemporary events in perspective, we turned to a pair of historians of the United States. Julian Zelizer is a professor of history and public affairs at Princeton University. He is the author, most recently, of The Fierce Urgency of Now: Lyndon Johnson, Congress, and the Battle for the Great Society. Morton Keller is a professor emeritus of history at Brandeis University. He has written or edited more than 15 books, including Obama’s Time: A History. They’ll be exchanging views periodically on how to understand Trump, his presidency, and this moment in political time. —Yoni Appelbaum
Neither truck drivers nor bankers would put up with a system like the one that influences medical residents’ schedules.
The path to becoming a doctor is notoriously difficult. Following pre-med studies and four years of medical school, freshly minted M.D.s must spend anywhere from three to seven years (depending on their chosen specialty) training as “residents” at an established teaching hospital. Medical residencies are institutional apprenticeships—and are therefore structured to serve the dual, often dueling, aims of training the profession’s next generation and minding the hospital’s labor needs.
How to manage this tension between “education and service” is a perennial question of residency training, according to Janis Orlowski, the chief health-care officer of the Association of American Medical Colleges (AAMC). Orlowski says that the amount of menial labor residents are required to perform, known in the profession as “scut work,” has decreased “tremendously” since she was a resident in the 1980s. But she acknowledges that even “institutions that are committed to education … constantly struggle with this,” trying to stay on the right side of the boundary between training and taking advantage of residents.
“The question confronting us as a nation is as consequential as any we have faced since the late 1940s,” a group of Republican and Democratic experts write.
Ben Rhodes, one of Barack Obama’s top advisers, once dismissed the American foreign-policy establishment—those ex-government officials and think-tank scholars and journalists in Washington, D.C., who advocate for a particular vision of assertive U.S. leadership in the world—as the “Blob.” Donald Trump had harsher words. As a presidential candidate, he vowed never to take advice on international affairs from “those who have perfect resumes but very little to brag about except responsibility for a long history of failed policies and continued losses at war.” Both men pointed to one of the Beltway establishment’s more glaring errors: support for the war in Iraq.
Now the Blob is fighting back. The “establishment” has been unfairly “kicked around,” said Robert Kagan, a senior fellow at the Brookings Institution and former official in the Reagan administration. As World War II gave way to the Cold War, President Harry Truman and his secretary of state, Dean Acheson, “invented a foreign policy and sold it successfully to the American people. That’s what containment was and that’s what the Truman Doctrine was. … That was the foreign-policy establishment.” During that period, the U.S. government also helped create a system for restoring order to a world riven by war and economic crisis. That system, which evolved over the course of the Cold War and post-Cold War period, includes an open international economy; U.S. military and diplomatic alliances in Asia, Europe, and the Middle East; and liberal rules and institutions (human rights, the United Nations, and so on).
In late 2015, in the Chilean desert, astronomers pointed a telescope at a faint, nearby star known as a red dwarf. Amid the star’s dim infrared glow, they spotted periodic dips, a telltale sign that something was passing in front of it, blocking its light every so often. Last summer, the astronomers concluded the mysterious dimming came from three Earth-sized planets—and that they were orbiting in the star’s temperate zone, where temperatures are not too hot, and not too cold, but just right for liquid water, and maybe even life.
This was an important find. Scientists for years had focused on stars like our sun in their search for potentially habitable planets outside our solar system. Red dwarfs, smaller and cooler than the sun, were thought to create inhospitable conditions. They’re also harder to see, detectable by infrared rather than visible light. But the astronomers aimed hundreds of hours’ worth of observations at this dwarf, known as TRAPPIST-1, anyway, using ground-based telescopes around the world and NASA’s Spitzer Space Telescope.
A $100 million gangster epic starring Robert De Niro, Al Pacino, and Joe Pesci has become too risky a proposition for major studios.
Martin Scorsese’s next project, The Irishman, is as close as you can get to a box-office guarantee for the famed director. It’s a gangster film based on a best-selling book about a mob hitman who claimed to have had a part in the legendary disappearance of the union boss Jimmy Hoffa. Robert De Niro is attached to play the hitman, Al Pacino will star as Hoffa, and Scorsese favorites Joe Pesci and Harvey Keitel are also on board. After Scorsese branched into more esoteric territory this year with Silence, a meditative exploration of faith and Catholicism, The Irishman sounds like a highly bankable project—the kind studios love. And yet, the film is going to Netflix, which will bankroll its $100 million budget and distribute it around the world on the company’s streaming service.
Plagues, revolutions, massive wars, collapsed states—these are what reliably reduce economic disparities.
Calls to make America great again hark back to a time when income inequality receded even as the economy boomed and the middle class expanded. Yet it is all too easy to forget just how deeply this newfound equality was rooted in the cataclysm of the world wars.
The pressures of total war became a uniquely powerful catalyst of equalizing reform, spurring unionization, extensions of voting rights, and the creation of the welfare state. During and after wartime, aggressive government intervention in the private sector and disruptions to capital holdings wiped out upper-class wealth and funneled resources to workers; even in countries that escaped physical devastation and crippling inflation, marginal tax rates surged upward. Concentrated for the most part between 1914 and 1945, this “Great Compression” of inequality (as economists call it) took several more decades to run its course fully across the developed world, lasting until the 1970s and 1980s, when it stalled and began to go into reverse.
Rod Dreher makes a powerful argument for communal religious life in his book, The Benedict Option. But he has not wrestled with how to live side by side with people unlike him.
Donald Trump was elected president with the help of 81 percent of white evangelical voters. Mike Pence, the champion of Indiana’s controversial 2015 religious-freedom law, is his deputy. Neil Gorsuch, a judge deeply sympathetic to religious litigants, will likely be appointed to the Supreme Court. And Republicans hold both chambers of Congress and statehouses across the country. Right now, conservative Christians enjoy more influence on American politics than they have in decades.
And yet, Rod Dreher is terrified.
“Don’t be fooled,” he tells fellow Christians in his new book, The Benedict Option. “The upset presidential victory of Donald Trump has at best given us a bit more time to prepare for the inevitable.”
Consolidated corporate power is keeping many products’ prices high and quality low. Why aren’t more politicians opposing it?
There are many competing interpretations for why Hillary Clinton lost last fall’s election, but most observers do agree that economics played a big role. Clinton simply didn’t articulate a vision compelling enough to compete with Donald Trump’s rousing, if dubious, message that bad trade deals and illegal immigration explain the downward mobility of so many Americans.
As it happens, Clinton did have the germ of exactly such an idea—if one knew where to look. In an October 2015 op-ed, she wrote that “large corporations are concentrating control over markets” and “using their power to raise prices, limit choices for consumers, lower wages for workers, and hold back competition from startups and small businesses. It’s no wonder Americans feel the deck is stacked for those at the top.” In a speech in Toledo last fall, Clinton assailed “old-fashioned monopolies” and vowed to appoint “tough” enforcers “so the big don’t keep getting bigger and bigger.”
You can tell a lot about a person from how they react to something.
That’s why Facebook’s various “Like” buttons are so powerful. Clicking a reaction icon isn’t just a way to register an emotional response; it’s also a way for Facebook to refine its sense of who you are. So when you “Love” a photo of a friend’s baby, and click “Angry” on an article about the New England Patriots winning the Super Bowl, you’re training Facebook to see you a certain way: You are a person who seems to love babies and hate Tom Brady.
The more you click, the more sophisticated Facebook’s idea of who you are becomes. (Remember: Although the reaction choices seem limited now—Like, Love, Haha, Wow, Sad, or Angry—up until around this time last year, there was only a “Like” button.)