The British royal family is an expensive anachronism and little more.
Queen Elizabeth visits the Dersingham Infant and Nursery School in Dersingham on the 60th anniversary of her rule / Reuters
Today is the sixtieth anniversary of Queen Elizabeth II's accession to
the British throne, which occurred upon her father's death in 1952.
Happy anniversary -- or Diamond Jubilee, as it is known -- Your Majesty.
Now: what exactly are you still doing there, anyway?
The royals' tourist appeal aside, there's something a bit jarring
both to logic and to liberal democratic sensibilities about what the
queen stands for. After all, British "citizens" are still at least
nominally, and arguably legally, considered "subjects." The United
Kingdom's Home Office and the passports it issues reflect the
country's switch in 1949 from the language of subjecthood to
citizenship, and thus make a distinction between "citizens of the United
Kingdom" and "British subjects." That's not a particularly pretty
distinction, since the latter is mostly a leftover of the country's imperial past.
But as plenty of experts have pointed out, there is no piece of paper that
officially designates Brits as "citizens."
And if a magazine-length article can be written under the headline "Are we subjects or citizens?" -- as the BBC did in 2005 -- then whatever scraps of citizenship cling to Britons can't be all that substantial.
The financial side of the British monarchy is no less quirky. Governing for payment is standard, but the queen reigns, which appears mostly to mean visiting things. Strange as this looks from a practical standpoint, it's even stranger in theory. In 2012, why would the people of a Western state pay someone to subjugate them?

That Britain is Western matters here not so much because of values but because of history. The British state was arguably the first in the region to be organized along the principles of an explicit social contract; it's the heir to the English Magna Carta of 1215 as well as the Glorious Revolution, when, for the first time, monarchs -- King William and Queen Mary -- were brought in to accept a crown on the subjects' own terms. Yet, in a twist that continues to fascinate historians, William and Mary paved the way for remarkably conservative stability in the ensuing centuries. France, as the trope goes, had a political revolution; Britain had an industrial one.

And here the two countries are today: France heading into the final stretch of a presidential election, while a not insignificant portion of the British economy gets poured into preparations for a June-weekend Diamond Jubilee for a figurehead queen whom Britons never explicitly agreed to support.
Though the March 2011 report on royal finances proudly announced a 19 percent decrease in the Queen's official expenditure over five years, is that really much solace? Her family will still spend £32.1 million, which is quite a lot of money. Remarkably, U.K. Education Secretary Michael Gove reportedly also wanted the public to give Her Majesty a £60 million royal yacht for the 2012 celebrations, although the details of that proposal are disputed and private donations were mentioned as well.
Downing Street nixed the public-funding idea, fortunately. Prime Minister David Cameron did declare early Monday, though, that "Today is a day to pay tribute to the magnificent service of Her Majesty the Queen." Her "experience, dignity, and quiet authority," he added, are indisputable, but "pay tribute" sounds a bit too atavistically close to home for comfort, and Brits don't have as much tribute to give up as they used to. And "magnificent service"? No one doubts the queen keeps a pretty punishing schedule for a lady her age, standing through formal ceremonies and visiting schools -- but there are a few palaces and a lifetime source of income in the deal.
The royal wedding is over. Kate's and Pippa's dresses were fantastic, and the hats were fun. No argument there. As a privately funded theme park, the royals have real potential. The monarchy, so the crown defenders' argument goes, does indeed bring in cash for the country through tourism and from the Crown Estate. But the current set-up is bizarre, and the frenzied yearning for a U.S. equivalent among so many of my American countrymen and women last spring was puzzling. In the cold, clear light of this less glamorous royal event, the monarchy looks like exactly what it is: a major anachronism. Nothing more.
Why haven’t more challengers entered the race to defeat the Iraq War hawk, Patriot Act supporter, and close friend of big finance?
As Hillary Clinton loses ground to Bernie Sanders in Iowa, where her lead shrinks by the day, it’s worth noticing that she has never made particular sense as the Democratic Party’s nominee. She may be more electable than her social-democratic rival from Vermont, but plenty of Democrats are better positioned to represent the center-left coalition. Why have they let the former secretary of state keep them out of the race? If Clinton makes it to the general election, I understand why most Democrats will support her. She shares their views on issues as varied as preserving Obamacare, abortion rights, extending legal status to undocumented workers, strengthening labor unions, and imposing a carbon tax to slow climate change.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
If the Fourteenth Amendment means that the children of undocumented immigrants are not citizens, as Donald Trump suggests, then they are also not subject to American laws.
Imagine the moon rising majestically over the Tonto National Forest, highlighting the stark desert scenery along the Superstition Freeway just west of Morristown, Arizona. The sheriff of Maricopa County sips coffee from his thermos and checks that his radar gun is at the ready. A lot of lawmen wouldn’t have bothered to send officers out at night on such a lonely stretch of road, much less taken the night shift themselves. But America’s Toughest Sheriff sets a good example for his deputies. As long as he’s the sheriff, at least, the rule of law—and the original intent of the Constitution—will be enforced by the working end of a nightstick.
Suddenly a car rockets by, going 100 miles an hour by the gun. Siren ululating, the sheriff heads west after the speeder. The blue Corolla smoothly pulls over to the shoulder. The sheriff sees the driver’s side window roll down. Cautiously he approaches.
Many educators are introducing meditation into the classroom as a means of improving kids’ attention and emotional regulation.
A five-minute walk from the rickety, raised track that carries the 5 train through the Bronx, the English teacher Argos Gonzalez balanced a rounded metal bowl on an outstretched palm. His class—a mix of black and Hispanic students in their late teens, most of whom live in one of the poorest districts in New York City—had by now grown used to the sight of this unusual object: a Tibetan meditation bell.
“Today we’re going to talk about mindfulness of emotion,” Gonzalez said with a hint of a Venezuelan accent. “You guys remember what mindfulness is?” Met with quiet stares, Gonzalez gestured to one of the posters pasted at the back of the classroom, where the students a few weeks earlier had brainstormed terms describing the meaning of “mindfulness.” There were some tentative mumblings: “being focused,” “being aware of our surroundings.”
Though it wasn’t pretty, Minaj was really teaching a lesson in civility.
Nicki Minaj didn’t, in the end, say much to Miley Cyrus at all. If you only read the comments that lit up the Internet at last night’s MTV Video Music Awards, you might think she was kidding, or got cut off, when she “called out” the former Disney star who was hosting: “And now, back to this bitch that had a lot to say about me the other day in the press. Miley, what’s good?”
To summarize: When Minaj’s “Anaconda” won the award for Best Hip-Hop Video, she took to the stage in a slow shuffle, shook her booty with presenter Rebel Wilson, and then gave an acceptance speech in which she switched vocal personas as amusingly as she does in her best raps—street-preacher-like when telling women “don’t you be out here depending on these little snotty-nosed boys”; sweetness and light when thanking her fans and pastor. Then a wave of nausea seemed to come over her, and she turned her gaze toward Cyrus. To me, the look on her face, not the words that she said, was the news of the night:
After calling his intellectual opponents treasonous, and allegedly exaggerating his credentials, a controversial law professor resigns from the United States Military Academy.
On Monday, West Point law professor William C. Bradford resigned after The Guardian reported that he had allegedly inflated his academic credentials. Bradford made headlines last week, when the editors of the National Security Law Journal denounced a controversial article by him in their own summer issue:
As the incoming Editorial Board, we want to address concerns regarding Mr. Bradford’s contention that some scholars in legal academia could be considered as constituting a fifth column in the war against terror; his interpretation is that those scholars could be targeted as unlawful combatants. The substance of Mr. Bradford’s article cannot fairly be considered apart from the egregious breach of professional decorum that it exhibits. We cannot “unpublish” it, of course, but we can and do acknowledge that the article was not presentable for publication when we published it, and that we therefore repudiate it with sincere apologies to our readers.
When cobbling together a livable income, many of America’s poorest people rely on the stipends they receive for donating plasma.
There is no money to be made selling blood anymore. It can, however, pay off to sell plasma, a component in blood that is used in a number of treatments for serious illnesses. It is legal to “donate” plasma up to two times a week, for which a bank will pay around $30 each time. Selling plasma is so common among America’s extremely poor that it can be thought of as their lifeblood.
But no one could reasonably think of a twice-weekly plasma donation as a job. It’s a survival strategy, one of many operating well outside the low-wage job market.
In Johnson City, Tennessee, we met a 21-year-old who donates plasma as often as 10 times a month—as frequently as the law allows. (The terms of our research prevent us from revealing her identity.) She is able to donate only when her husband has time to keep an eye on their two young daughters. When we met him in February, he could do that pretty frequently because he’d been out of work since the beginning of December, when McDonald’s reduced his hours to zero in response to slow foot traffic. Six months ago, walking his wife to the plasma clinic and back, kids in tow, was the most important job he had.
Meaningful work, argues psychologist Barry Schwartz, shouldn't be a luxury. It should be a feature of every job, from CEO to factory worker.
There’s a belief that what gets some workers to keep coming into work every day is their “psychic wages”—the fulfillment that comes with doing meaningful work. That thinking is usually applied to authors, or doctors, or social workers, but the assumption for why a different class of workers—janitors, factory workers, call-center employees—keeps showing up every day is often simpler: They aren’t there for anything but money.
But Barry Schwartz, a professor of psychology at Swarthmore College, believes that jobs are about more than money, for both blue- and white-collar workers alike. When he was trained as a psychologist, decades ago, the thinking of B. F. Skinner—of Skinner Box fame—dominated the field. Skinner’s view of human nature was that every action can be explained through the lens of rewards and punishment: If someone wasn’t doing something, he or she simply wasn’t getting a sufficient reward for it. “And that always struck me as wrong—at least, as a description of human beings, as incomplete,” Schwartz told me.
Alaska has more than $50 billion of oil money in the bank. Why can’t it pay its bills?
WASILLA, Alaska—This state has more money in the bank than most small countries. Decades of collecting royalties and revenues from the companies that drilled for oil on its slopes have endowed Alaska with a $50 billion savings account. Residents pay neither income nor sales tax, and every October, they get a check from the government simply for living in Alaska—this year, the check could total $2,000.
But the years of plenty may be coming to an end. As the price of oil has fallen from more than $100 a barrel to around $40 and oil production has slowed, Alaska is seeing the downside of relying on natural resources to pay the bills. For every $5 drop in oil prices, the state loses $120 million, according to Randall Hoffbeck, commissioner of the Alaska Department of Revenue.
The top lobbyist for the agreement, along with John Kerry’s former chief of staff, answer a prominent critic’s questions for President Obama.
Last week, I posted a series of sharp, critical questions addressed to President Obama from Robert Satloff, the executive director of the Washington Institute for Near East Policy, about the Iran nuclear deal. A number of administration surrogates have subsequently offered me their own answers to Satloff’s questions. (I’ve asked administration officials to answer the questions as well, but I’m not sure the White House is gripped by the same sense of urgency to answer such questions as it once was, considering that Obama is on a clear path to victory now against his congressional opponents.)
But two people particularly relevant to this debate—Jeremy Ben-Ami, the head of J Street, the pro-Obama, anti-Netanyahu Jewish organization, who is also the de facto chief pro-deal lobbyist inside (and outside) the Jewish community; and David Wade, John Kerry’s former chief of staff, who is now helping J Street in its pro-deal campaign—have offered me lengthy answers, and I’m publishing them below. I might do one more round of this by asking an opponent of the deal to respond to Ben-Ami and Wade. (Satloff hasn’t actually come out against the deal.)