The British royal family is an expensive anachronism and little more.
Queen Elizabeth visits the Dersingham Infant and Nursery School in Dersingham on the 60th anniversary of her rule / Reuters
Today is the sixtieth anniversary of Queen Elizabeth II's accession to
the British throne, which occurred upon her father's death in 1952.
Happy anniversary -- or Diamond Jubilee, as it is known -- Your Majesty.
Now: what exactly are you still doing there, anyway?
The royals' tourist appeal aside, there's something a bit jarring,
both to logic and to liberal democratic sensibilities, about what the
queen stands for. After all, British "citizens" are still, at least
nominally and arguably legally, considered "subjects." The United
Kingdom's Home Office and the passports it issues reflect the
country's switch in 1949 from the language of subjecthood to that of
citizenship, and thus distinguish between "citizens of the United
Kingdom" and "British subjects." That's not a particularly pretty
distinction, since the latter category is mostly a leftover of the country's
colonial past. But as plenty of experts have pointed out, there is no piece of paper that
officially designates Brits as "citizens." And if a magazine-length article
can be written under the headline "Are we subjects or citizens?", as the BBC did in 2005, then whatever scraps of citizenship cling to Britons can't be all that substantial.
The financial side of the British monarchy is no less quirky. Being paid to govern is standard, but the queen reigns, which appears mostly to mean visiting things. Strange as this looks from a practical standpoint, it's even stranger in theory. In 2012, why would the people of a Western state pay someone to subjugate them? That Britain is Western matters here not so much because of values but because of history. The British state was arguably the first in the region to be organized along the principles of an explicit social contract: it's the heir to Magna Carta in 1215 as well as to the Glorious Revolution, in which, for the first time, monarchs -- King William and Queen Mary -- were brought in to accept a crown on the subjects' own terms.

Yet, in a twist that continues to fascinate historians, William and Mary paved the way for remarkably conservative stability in the ensuing centuries. France, as the trope goes, had a political revolution; Britain had an industrial one. And here the two countries are today: France heading into the final stretch of a presidential election, while a not-insignificant portion of the British economy gets poured into preparations for a June-weekend Diamond Jubilee of a figurehead queen whom Britons never explicitly agreed to support.
Though the March 2011 report on royal finances proudly announced a 19 percent decrease in the queen's official expenditure over the course of five years, is this really much solace? Her family will still spend £32.1 million, quite a lot of money. Remarkably, the UK education secretary, Michael Gove, reportedly also wanted the public to give Her Majesty a £60 million royal yacht for the 2012 celebrations, although the details of that proposal are disputed, and private donations were mentioned as well.
Downing Street, fortunately, nixed the idea of public funding. Prime Minister David Cameron did declare early Monday, though, that "Today is a day to pay tribute to the magnificent service of Her Majesty the Queen." Her "experience, dignity, and quiet authority," which he also praised, are indisputable, but "pay tribute" seems a bit too atavistically close to home for comfort, and Brits don't have as much tribute to give up as they used to. And "magnificent service"? No one doubts the queen keeps a pretty punishing schedule for a lady her age, standing through formal ceremonies and visiting schools -- but there are a few palaces and a lifetime source of income in the deal.
The royal wedding is over. Kate's and Pippa's dresses were fantastic, and the hats were fun. No argument there. As a privately funded theme park, the royals have real potential. The monarchy, so the crown defenders' argument goes, does indeed bring in cash for the country through tourism and from the Crown Estate. But the current set-up is bizarre, and the frenzied yearning for a U.S. equivalent among so many of my American countrymen and women last spring was puzzling. In the cold, clear light of this less glamorous royal event, the monarchy looks like exactly what it is: a major anachronism. Nothing more.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi's per-capita income had climbed to 58 percent of Connecticut's.
Live in anticipation, gathering stories and memories. New research builds on the vogue mantra of behavioral economics.
Forty-seven percent of the time, the average mind is wandering. It wanders about a third of the time while a person is reading, talking with other people, or taking care of children. It wanders 10 percent of the time, even, during sex. And that wandering, according to the psychologist Matthew Killingsworth, is not good for well-being. A mind belongs in one place. During his training at Harvard, Killingsworth compiled those numbers and built a scientific case for every cliché about living in the moment. In a 2010 Science paper co-written with the psychology professor Daniel Gilbert, Killingsworth concluded that "a wandering mind is an unhappy mind."
For Killingsworth, happiness is in the content of moment-to-moment experiences. Nothing material is intrinsically valuable, except in whatever promise of happiness it carries. Satisfaction in owning a thing does not have to come during the moment it's acquired, of course. It can come as anticipation or as nostalgic longing. Overall, though, the human brain's capacity to contemplate events past and future at great, tedious length has, these psychologists believe, come at the expense of happiness. Minds tend to wander to dark, not whimsical, places -- unless a mind has something exciting to anticipate or sweet to remember.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing, captured on video, is clear and unambiguous.
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”
The Science study, which found that poverty itself imposes cognitive burdens that lead to bad decision-making, was widely seen as a counter-argument to claims that poor people are "to blame" for bad decisions and as a rebuke to policies that withhold money from the poorest families unless they behave in a certain way. After all, if being poor leads to bad decision-making (as opposed to the other way around), then giving cash should alleviate the cognitive burdens of poverty, all on its own.
Sometimes, science doesn't stick without a proper anecdote, and "Why I Make Terrible Decisions," a comment published on Gawker's Kinja platform by a person living in poverty, is a devastating illustration of the study's findings. I've bolded the portions I found most moving and insightful, but it's a powerful testimony all the way through.
The University of Chicago asks a group of academics about gift-giving and the holidays. Their responses will melt your heart.
Cash is the most efficient gift, according to economists. Cash is also a terrible gift, according to economists. By guaranteeing that the recipient can buy exactly what she wants, you guarantee that the recipient will consider you an unemotional robot.
That's why the vast majority of economists in the University of Chicago's IGM poll said it's absurd to give cash to loved ones for the holidays. "In some cases," Steven Kaplan said, in a stirring defense of thoughtful gifts, "non-pecuniary [not cash-related] values are important."
Non-pecuniary values are important! I guess so. But can you imagine a more wooden explication of love? Can you imagine a more wooden explanation of anything? Just think of the Christmas card...
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
Students at Princeton University are protesting the ways it honors the former president, who once threw a civil-rights leader out of the White House.
The Black Justice League, in protests on Princeton University’s campus, has drawn wider attention to an inconvenient truth about the university’s ultimate star: Woodrow Wilson. The Virginia native was a racist, a fact largely overshadowed by his work as Princeton’s president, as New Jersey’s governor, and, most notably, as the 28th president of the United States.
As president, Wilson oversaw unprecedented segregation in federal offices. It’s a shameful side to his legacy that came to a head one fall afternoon in 1914 when he threw the civil-rights leader William Monroe Trotter out of the Oval Office.
Trotter led a delegation of blacks to meet with the president on November 12, 1914, to discuss the surge of segregation in the country. Trotter, today largely forgotten, was a nationally prominent civil-rights leader and newspaper editor. In the early 1900s, he was often mentioned in the same breath as W.E.B. Du Bois and Booker T. Washington. But unlike Washington, Trotter, an 1895 graduate of Harvard, believed in direct protest actions. In fact, Trotter founded his Boston newspaper, The Guardian, as a vehicle to challenge Washington’s more conciliatory approach to civil rights.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.