What newly released documents from the UK colony in Kenya say about the rise of the great powers that have followed.
Four Kenyans protest British colonial-era abuses outside the High Court in London.
This week, the UK Foreign Office released the first in a series of embarrassing government files from the country's colonial era. The release follows a lawsuit by five Kenyans -- now four, after one of the original claimants died -- who say they were tortured during the anti-colonial Mau Mau uprising in Kenya in the 1950s. The files are a reminder of the not-so-distant time when European powers were still free to pursue openly imperial policies, and crimes committed against longstanding colonies barely counted. Some of the papers show, The Guardian reports, that "thousands of documents detailing some of the most shameful acts and crimes committed during the final years of the British empire were systematically destroyed to prevent them falling into the hands of post-independence governments."
The release of documents from the British Foreign Office shouldn't just be an opportunity to point fingers. Though the Kenyans who filed the suit need to be heard, and there ought to be some sort of accountability for colonial crimes, it's a little too easy for those of us in countries with similarly dark pasts to hyperfocus on this one period of British wrongdoing. Colonialism is over, but there are still world powers, and they're still abusing their power. In fact, the exploitations are often similar precisely because the crimes of one superpower often provide the template, or even the impetus, for the abuses of the next powerful state.
A Guardian editor pointed out, "Americans should always resist the easy temptation to take too much moral high ground over the Brits," as "they have their Kenyas" as well, such as slavery or the treatment of Native Americans. It can sometimes seem inevitable that a dominant world power, whether the U.S. or Great Britain or one of the many before and maybe someday after, will have some exploitative and even shameful moments in its history. So do most countries, powerful or not.
But it's the exploitative actions of the dominant powers that tend to come back to haunt the wider world. Germany's territorial ambitions, both in the German Empire from 1871 to 1918 and during World War II and its lead-up, were modeled in part on the naked British imperialism of earlier generations. Soviet aggression following World War II had as much to do with watching and experiencing Western European exploitation as it did with Communism and ideology. History, it seems, is offered as a justification almost as often as it is held up as a model.
Part of this phenomenon is that the powerful get to do what they want, and powerful countries tend to want the same things: political, military, or economic control of strategic regions, economic prosperity, etc. But the deeds of onetime powers really do seem to have some effect on the deeds of up-and-coming powers.
Beyond the complex motivations driving, for example, German territorial expansion, there is an overarching pattern. We see it today when developing nations such as India or China protest European and American demands that they make carbon cuts. The "West" industrialized using fossil fuels -- why shouldn't everyone else be able to do the same? To take another example, because the United States developed a nuclear bomb and dropped it on Japan, other countries have cited that fact to reject American demands that they not develop nuclear programs of their own.
Right now, we are two years away from the hundredth anniversary of the outbreak of World War I -- a great-power turning point of sorts, when the German Empire mounted its first serious military challenge to British hegemony, and, though the challenge was unsuccessful and the war ultimately increased the size of the British Empire, the British colonies started to break free. Over the course of the next few decades, maps had to be redrawn quite a few times. By the end of World War II, it was clear that neither Britain nor Germany was going to dominate the twentieth century. The Soviet Union and the United States had already been sizing each other up for several years.
This week, there have been two prominent news stories concerning the U.S.-China relationship. The two states, it seems, have been engaging in cyber "war games" through think tanks, with the U.S. mindful of China's growing power in this area. On Thursday, U.S. Defense Secretary Leon Panetta publicly accused China of assisting North Korea with its missile program.
You don't need to be worried about China's rise (or the West's maybe-decline) to see a familiar, though probably far less dangerous, re-shifting of power dynamics at work. Maybe China arming North Korea would be, from a world peace standpoint, better or worse than the U.S. arming the mujahideen in Afghanistan or the Contras in Nicaragua. Maybe China's expansion into Tibet has some similarities to the U.S. westward expansion into Native Americans' territory. It's tricky to balance out competing perspectives. But the parallels are tough to miss.
Over the next few decades, however, we may get to watch this pattern play out some more. And colonial Britain, after all, also held Hong Kong. The United States isn't the only world power China has fresh in its memory -- and the U.K. Foreign Office release this week probably won't be the last time imperial pasts suddenly become relevant again.
The kerfuffle over Kim Kardashian's drug-promoting Instagram selfie is nothing new: As long as the agency has existed, it's had to figure out how to regulate drug advertisements in new forms of communication technology.
Last month, celebrity-news and health-policy bloggers had a rare moment of overlap after the Food and Drug Administration issued a warning letter to the pharmaceutical company Duchesnay, which manufactures Diclegis, a prescription-only anti-nausea pill. At stake: a single selfie with pill bottle.
The image that attracted the censure of the FDA was an Instagram posted on July 20 by Kim Kardashian. The image featured her upper torso, right hand, and face, with a bottle of Diclegis prominently displayed in her grasp. “OMG,” the caption began:
Have you heard about this? As you guys know my #morningsickness has been pretty bad. I tried changing things about my lifestyle and my diet, but nothing helped, so I talked to my doctor. He prescribed me Diclegis, I felt better, and most importantly it’s been studied and there is no increased risk to the baby.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
Shedding pounds is usually a losing battle—research suggests it’s better to just focus on building a healthy lifestyle.
“My own history of yo-yo dieting started when I was 15 and lasted about three decades,” said Sandra Aamodt, a neuroscientist and the author of Why Diets Make Us Fat, at the Aspen Ideas Festival on Saturday. “I lost the same 15 pounds pretty much every year during that same period, and gained it back regular as clockwork.”
This is a classic tale—the diet that doesn’t take, the weight loss that comes right back. The most recent, extreme, highly publicized case was that of the study done on contestants from the reality show The Biggest Loser, most of whom, six years after losing 100 to 200 pounds, had gained most of it back, and had significantly slowed metabolisms.
The study provided a dramatic example of how the body fights against weight loss. And sheer force of will is rarely sufficient to fight back.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
Patrick Griffin, Clinton’s chief congressional-affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, in part because it expired in 2004 and in part because the provision was embedded in the larger crime bill.
Demographic data shows that a Briton’s education level may be the strongest indicator of how he or she voted.
Britain has voted to leave the European Union. The news surprised many people, including the British, who have learned that while brushing off early statistical warnings is tempting, it doesn’t make it any easier when those warnings turn out to be right. Give yourselves a break, I say: Polls are fickle, anecdote is limited, and prevailing wisdom is sometimes impossible to shake. (Though these remorseful Brexit voters don’t have an excuse.)
There’s a silver lining for statistics, however. With the close of Britain’s referendum, political analysts now have a concrete dataset to examine: the actual vote totals in the United Kingdom. This data, when matched with regional demographic information from the U.K. Census, gives insight into who actually voted to leave or remain.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
The U.K.’s vote to leave the European Union betrays a failure of empathy and imagination among its leaders. Will America’s political establishment fare any better?
If there is a regnant consensus among the men and women who steer the Western world, it is this: The globe is flattening. Borders are crumbling. Identities are fluid. Commerce and communications form the warp and woof, weaving nations into the tight fabric of a global economy. People are free to pursue opportunity, enriching their new homes culturally and economically. There may be painful dislocations along the way, but the benefits of globalization heavily outweigh its costs. And those who cannot see this, those who would resist it, those who would undo it—they are ignorant of their own interests, bigoted, xenophobic, and backward.
So entrenched is this consensus that, for decades, in most Western democracies, few mainstream political parties have thought to challenge it. They have left it to the politicians on the margins of the left and the right to give voice to such sentiments—and voicing such sentiments relegated politicians to the margins of political life.