The British royal family is an expensive anachronism and little more.
Queen Elizabeth visits the Dersingham Infant and Nursery School in Dersingham on the 60th anniversary of her rule / Reuters
Today is the sixtieth anniversary of Queen Elizabeth II's accession to
the British throne, which occurred upon her father's death in 1952.
Happy anniversary -- or Diamond Jubilee, as it is known -- Your Majesty.
Now: what exactly are you still doing there, anyway?
The royals' tourist appeal aside, there's something a bit jarring
both to logic and to liberal democratic sensibilities about what the
queen stands for. After all, British "citizens" are still at least
nominally, and arguably legally, considered "subjects." The United
Kingdom's Home Office and the passports it issues reflect the
country's switch in 1949 from the language of subjecthood to
citizenship, and thus make a distinction between "citizens of the United
Kingdom" and "British subjects." That's not a particularly pretty
distinction, since the latter is mostly a leftover of the country's imperial past.
But as plenty of experts have pointed out, there is no piece of paper that
officially designates Brits as "citizens."
And if a magazine-length article
can be written under the headline "Are we subjects or citizens?" -- as the BBC did in 2005 -- then whatever scraps of citizenship cling to Britons can't be all that substantial.
The financial side of the British monarchy is no less quirky. Governing for payment is standard, but the queen reigns, which appears mostly to mean visiting things. Strange as this looks from a practical standpoint, it's even stranger in theory. In 2012, why would the people of a Western state pay someone to subjugate them? That Britain is Western matters here not so much because of values but because of history. The British state was arguably the first in the region to be organized along the principles of an explicit social contract; it is the heir to Magna Carta of 1215 as well as to the Glorious Revolution of 1688, in which, for the first time, monarchs -- King William and Queen Mary -- were brought in to accept a crown on the subjects' own terms.

Yet, in a twist that continues to fascinate historians, William and Mary paved the way for remarkably conservative stability in the ensuing centuries. France, as the trope goes, had a political revolution; Britain had an industrial one. And here the two countries are today: France is heading into the final stretch of a presidential election, while a not insignificant portion of the British economy is poured into preparations for a June-weekend Diamond Jubilee for a figurehead queen whom Britons never explicitly agreed to support.
Though the March 2011 report on the royal finances proudly announced a 19% decrease in the Queen's official expenditure over the course of five years, is this really much solace? Her family will still spend £32.1 million, which is quite a lot of money. Remarkably, the UK education secretary, Michael Gove, reportedly also wanted the public to donate a £60 million royal yacht to Her Majesty for the 2012 celebrations, although the details of that proposal are disputed, and private donations were mentioned as well.
Downing Street nixed the public-funding idea, fortunately. Prime Minister David Cameron did declare early Monday, though, that "Today is a day to pay tribute to the magnificent service of Her Majesty the Queen." Her "experience, dignity, and quiet authority," he added, are indisputable, but "pay tribute" sits a bit too atavistically close to home for comfort, and Brits don't have as much tribute to give up as they used to. And "magnificent service"? No one doubts the queen keeps, for a lady her age, a pretty punishing schedule of standing through formal ceremonies and visiting schools -- but there are a few palaces and a lifetime source of income in the deal.
The royal wedding is over. Kate's and Pippa's dresses were fantastic, and the hats were fun. No argument there. As a privately funded theme park, the royals have real potential. The monarchy, so the crown defenders' argument goes, does indeed bring in cash for the country through tourism and from the Crown Estate. But the current set-up is bizarre, and the frenzied yearning for a U.S. equivalent among so many of my American countrymen and women last spring was puzzling. In the cold, clear light of this less glamorous royal event, the monarchy looks like exactly what it is: a major anachronism. Nothing more.
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just as usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
The new version of Apple’s signature media software is a mess. What are people with large MP3 libraries to do?
When the developer Erik Kemp designed the first metadata system for MP3s in 1996, he provided only three options for attaching text to the music: every audio file could be labeled with an artist, a song name, and an album title.
Kemp’s system has since been augmented and improved upon, but never replaced. Which makes sense: Like the web itself, his schema was shipped, good enough, and an improvement on the vacuum that preceded it. Those three big tags, as they’re called, work well with pop and rock written between 1960 and 1995. That didn’t prevent rampant mislabeling in the early days of the web, though, as anyone who remembers Napster can tell you. And the system stumbles even more when it needs to capture hip-hop’s tradition of guest MCs or jazz’s vibrant culture of studio musicianship.
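The constraint is easy to see in miniature. Here is a simplified sketch of the three-tag idea — the class and field names are illustrative, not the actual byte-level layout of Kemp's format:

```python
from dataclasses import dataclass

@dataclass
class ThreeTagLabel:
    """The only metadata the 1996 schema allowed per audio file."""
    artist: str
    song_name: str
    album_title: str

# A classic-era rock single fits the schema cleanly:
rock = ThreeTagLabel("Fleetwood Mac", "Dreams", "Rumours")

# A jazz date does not: the bandleader takes the artist slot,
# and the studio musicians simply have nowhere to go.
jazz = ThreeTagLabel("Miles Davis", "So What", "Kind of Blue")
sidemen = ["John Coltrane", "Bill Evans", "Paul Chambers", "Jimmy Cobb"]
# None of these names can be recorded in the three fields above.
```

Every track, whatever its provenance, gets squeezed into the same three boxes — which is exactly why sidemen, guest MCs, and producers vanish from the record.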
A leading neuroscientist who has spent decades studying creativity shares her research on where genius comes from, whether it is dependent on high IQ—and why it is so often accompanied by mental illness.
As a psychiatrist and neuroscientist who studies creativity, I’ve had the pleasure of working with many gifted and high-profile subjects over the years, but Kurt Vonnegut—dear, funny, eccentric, lovable, tormented Kurt Vonnegut—will always be one of my favorites. Kurt was a faculty member at the Iowa Writers’ Workshop in the 1960s, and participated in the first big study I did as a member of the university’s psychiatry department. I was examining the anecdotal link between creativity and mental illness, and Kurt was an excellent case study.
He was intermittently depressed, but that was only the beginning. His mother had suffered from depression and committed suicide on Mother’s Day, when Kurt was 21 and home on military leave during World War II. His son, Mark, was originally diagnosed with schizophrenia but may actually have bipolar disorder. (Mark, who is a practicing physician, recounts his experiences in two books, The Eden Express and Just Like Someone Without Mental Illness Only More So, in which he reveals that many family members struggled with psychiatric problems. “My mother, my cousins, and my sisters weren’t doing so great,” he writes. “We had eating disorders, co-dependency, outstanding warrants, drug and alcohol problems, dating and employment problems, and other ‘issues.’ ”)
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first state to vote. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
Some say the so-called sharing economy has gotten away from its central premise—sharing.
This past March, in an up-and-coming neighborhood of Portland, Maine, a group of residents rented a warehouse and opened a tool-lending library. The idea was to give locals access to everyday but expensive garage, kitchen, and landscaping tools—such as chainsaws, lawnmowers, wheelbarrows, a giant cider press, and soap molds—to save unnecessary expense as well as clutter in closets and tool sheds.
The residents had been inspired by similar tool-lending libraries across the country—in Columbus, Ohio; in Seattle, Washington; in Portland, Oregon. The ethos made sense to the Mainers. “We all have day jobs working to make a more sustainable world,” says Hazel Onsrud, one of the Maine Tool Library’s founders, who works in renewable energy. “I do not want to buy all of that stuff.”
Writing used to be a solitary profession. How did it become so interminably social?
Whether we’re behind the podium or awaiting our turn, numbing our bottoms on the chill of metal foldout chairs or trying to work some life into our terror-stricken tongues, we introverts feel the pain of the public performance. This is because there are requirements to being a writer. Other than being a writer, I mean. Firstly, there’s the need to become part of the writing “community,” which compels every writer who craves self-respect and success to attend community events, help to organize them, buzz over them, and—despite blitzed nerves and staggering bowels—present and perform at them. We get through it. We bully ourselves into it. We dose ourselves with beta blockers. We drink. We become our own worst enemies for a night of validation and participation.
A controversial treatment shows promise, especially for victims of trauma.
It’s straight out of a cartoon about hypnosis: A black-cloaked charlatan swings a pendulum in front of a patient, who dutifully watches and ping-pongs his eyes in turn. (This might be chased with the intonation, “You are getting sleeeeeepy...”)
Unlike most stereotypical images of mind alteration—“Psychiatric help, 5 cents” anyone?—this one is real. An obscure type of therapy known as EMDR, or Eye Movement Desensitization and Reprocessing, is gaining ground as a potential treatment for people who have experienced severe forms of trauma.
Here’s the idea: The patient is told to focus on the troubling image or negative thought while simultaneously moving her eyes back and forth. To prompt this, the therapist might move his fingers from side to side, or tap or wave a wand. The patient is then told to let her mind go blank and notice whatever sensations come to mind. These steps are repeated throughout the session.
The authors in the running for Britain's most prestigious literary award come from seven countries and include seven women writers.
The longlist for the Man Booker Prize, one of the most prestigious literary awards, was announced Wednesday. For the second year, the prize was open to writers of any nationality who publish books in English in the U.K., and this year five American writers made the list of 13 contenders, chosen by five judges from a pool of 156 total works.
The U.S. is, in fact, the best-represented country, with other entrants hailing from Great Britain, Jamaica, New Zealand, Nigeria, Ireland, and India. There are three debut novelists and one former winner on the list, and women writers outnumber men seven to six. From dystopian and political novels to a multitude of iterations on the family drama, the selections capture the ever-changing human experience in very different ways.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subject areas on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a remote proctor monitors her computer and watches her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.