The British royal family is an expensive anachronism and little more.
Queen Elizabeth visits the Dersingham Infant and Nursery School in Dersingham on the 60th anniversary of her rule / Reuters
Today is the sixtieth anniversary of Queen Elizabeth II's accession to
the British throne, which occurred upon her father's death in 1952.
Happy anniversary -- or Diamond Jubilee, as it is known -- Your Majesty.
Now: what exactly are you still doing there, anyway?
The royals' tourist appeal aside, there's something a bit jarring, both to logic and to liberal democratic sensibilities, about what the queen stands for. After all, British "citizens" are still at least nominally, and arguably legally, considered "subjects." The United Kingdom's Home Office and the passports it issues reflect the country's 1949 switch from the language of subjecthood to that of citizenship, and thus distinguish between "citizens of the United Kingdom" and "British subjects." That's not a particularly pretty distinction, since the latter category is mostly a leftover of the country's imperial past. But as plenty of experts have pointed out, there is no piece of paper that officially designates Brits as "citizens." And if a magazine-length article can be written under the headline "Are we subjects or citizens?" -- as the BBC did in 2005 -- whatever scraps of citizenship cling to Britons can't be all that substantial.
The financial side of the British monarchy is no less quirky. Paying people to govern is standard; the queen, though, merely reigns, which appears mostly to mean visiting things. Strange as this looks from a practical standpoint, it's even stranger in theory. In 2012, why would the people of a Western state pay someone to subjugate them?

That Britain is Western matters here not so much because of values but because of history. The British state was arguably the first in the region to be organized along the principles of an explicit social contract; it's the heir to the English Magna Carta of 1215 as well as to the Glorious Revolution of 1688, when, for the first time, monarchs -- King William and Queen Mary -- were brought in to accept a crown on the subjects' own terms. Yet, in a twist that continues to fascinate historians, William and Mary paved the way for remarkably conservative stability in the ensuing centuries. France, as the trope goes, had a political revolution; Britain had an industrial one. And here the two countries are today: France heading into the final stretch of a presidential election, while a not-insignificant portion of the British economy gets poured into preparations for a June-weekend Diamond Jubilee for a figurehead queen whom Britons never explicitly agreed to support.
Though the March 2011 report on royal finances proudly announced a 19 percent decrease in the queen's official expenditure over the course of five years, is that really much solace? Her household will still spend £32.1 million, quite a lot of money. Remarkably, the UK education secretary, Michael Gove, reportedly also wanted the public to donate a £60 million royal yacht to Her Majesty for the 2012 celebrations, though the details of that proposal are disputed, and private donations were mentioned as well.
Downing Street nixed the public-funding idea, fortunately. Prime Minister David Cameron did declare early Monday, though, that "Today is a day to pay tribute to the magnificent service of Her Majesty the Queen." The "experience, dignity, and quiet authority" he also cited are indisputable, but "pay tribute" seems a bit too atavistically close to home for comfort, and Brits don't have as much tribute to give up as they used to. And "magnificent service"? No one doubts the queen keeps a pretty punishing schedule of standing in formal ceremonies and visiting schools for a lady her age -- but there are a few palaces and a lifetime source of income in the deal.
The royal wedding is over. Kate's and Pippa's dresses were fantastic, and the hats were fun. No argument there. As a privately funded theme park, the royals have real potential. The monarchy, so the crown defenders' argument goes, does indeed bring in cash for the country through tourism and from the Crown Estate. But the current set-up is bizarre, and the frenzied yearning for a U.S. equivalent among so many of my American countrymen and women last spring was puzzling. In the cold, clear light of this less glamorous royal event, the monarchy looks like exactly what it is: a major anachronism. Nothing more.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast against the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. president has been a better comedian than Barack Obama. It’s really that simple.
That’s not to say other modern presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington, D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are among the most plausible threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders weigh every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. At life-long scales, one in 120 Americans dies in an accident.
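The compounding is easy to verify. Here’s a minimal back-of-the-envelope sketch of that arithmetic, assuming a lifespan of roughly 80 years (an assumption on my part; the exact horizon isn’t stated above):

```python
# Back-of-the-envelope check of the lifetime crash-risk figure above.
# Assumption (not from the report): an average lifespan of about 80 years.

annual_risk = 1 / 9395   # yearly chance of dying in a crash (~0.01 percent)
lifespan_years = 80      # assumed lifespan

# Chance of dying in a crash at some point in life
# = 1 - (chance of surviving every single year).
lifetime_risk = 1 - (1 - annual_risk) ** lifespan_years

print(f"Lifetime risk: {lifetime_risk:.4%}")       # ~0.85%
print(f"Roughly 1 in {round(1 / lifetime_risk)}")  # ~1 in 118, close to the 1-in-120 figure
```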
A pastor and a rabbi talk about kids, poop, and tearing down the patriarchy in institutional religion.
The Bible is a man’s book. It was mostly written by men, for men, and about men. The people who then interpreted the text have also been predominantly male.
No wonder there’s not much theology preoccupied with weird-colored poop and the best way to weather tantrums. Throughout history, childcare has largely been considered women’s work—and, by extension, not theologically serious.
Danya Ruttenberg—a Conservative rabbi whose book about parenting came out in April—disagrees. So does Bromleigh McCleneghan, a Chicago-area pastor and the author of a 2012 book about parenting and a forthcoming book about Christians and sex. Both women have made their careers in writing and ministry. But they’re also both moms, and they believe the work they do as parents doesn’t have to remain separate from the work they do as theologians.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
...isn't something that can be done on campus. It's an internship.
When I was 17, if you asked me how I planned on getting a job in the future, I think I would have said: Get into the right college. When I was 18, if you asked me the same question, I would have said: Get into the right classes. When I was 19: Get good grades.
But when employers recently named the most important elements in hiring a recent graduate, college reputation, GPA, and courses came in at the bottom of the list. At the top, according to the Chronicle of Higher Education, were experiences outside of academics: internships, jobs, volunteering, and extracurriculars.
What Employers Want
"When employers do hire from college, the evidence suggests that academic skills are not their primary concern," says Peter Cappelli, a Wharton professor and the author of a new paper on job skills. "Work experience is the crucial attribute that employers want even for students who have yet to work full-time."
For too long now, Game of Thrones viewers have been separated into two distinct camps, as unlike each other as Lannisters and Starks: There are those who’ve read the books, and those who haven’t. Promotional material for new seasons has historically been greeted with rabid excitement by newbies, and satisfied nods from smug readers who knew what was coming down the pike. But with the upcoming sixth season, we’ve finally evolved beyond these petty divisions. We can unite and watch the new trailer in awe together, wondering which characters will perish, who will be usurped, and whether Jon Snow is really dead or not.
Scored to a cover version of Chris Isaak’s “Wicked Game” by James Vincent McMorrow, the trailer is full of action and bombast, but delivered at a strangely mournful tempo. That makes sense considering the abject misery most of the cast was in last year: Khaleesi Daenerys deposed from her throne in Slaver’s Bay, Queen Cersei forced to do a nude walk of shame through King’s Landing by religious zealots, Arya blinded by a guild of assassins, and Jon stabbed to death by his compatriots in the Night’s Watch. The trailer keeps things vague, but it’s nice to see Daenerys, Arya, Cersei, Sansa, and others all on their feet again, looking to avenge past wrongs.
How the North Vietnamese remember the conflict 40 years after the fall of Saigon
HANOI, VIETNAM—Forty years ago, on April 30, 1975, Nguyen Dang Phat experienced the happiest day of his life.
That morning, as communist troops swept into the South Vietnamese capital of Saigon and forced the U.S.-backed government to surrender, the North Vietnamese Army soldier marked the end of the war along with a crowd of people in Hanoi. The city was about to become the capital of a unified Vietnam. “All the roads were flooded by people holding flags,” Nguyen, now 65, told me recently. “There were no bombs or airplane sounds or screaming. The happy moment was indescribable.”
The event, known in the United States as the fall of Saigon and conjuring images of panicked Vietnamese trying to crowd onto helicopters to be evacuated, is celebrated as Reunification Day here in Hanoi. The holiday involves little explicit reflection on the country’s 15-year-plus conflict, in which North Vietnam and its supporters in the South fought to unify the country under communism, and the U.S. intervened on behalf of South Vietnam’s anti-communist government. More than 58,000 American soldiers died in the fighting between 1960 and 1975; the estimated number of Vietnamese soldiers and civilians killed on both sides varies widely, from 2.1 million to 3.8 million during the American intervention and in related conflicts before and after.