China considers cracking down on outspoken musicians by requiring them to have a degree. Why?
Sir Elton John receives an Honorary Doctorate from the Royal Academy of Music at the Academy's 2002 graduation ceremony, London, July 3, 2002. (Peter Macdiarmid/Reuters)
While it is axiomatic these days that you need to go to college to get ahead in life, there are a few professions in which having a four-year degree isn't strictly necessary. Basketball players, for example, need not have a degree to be successful at their craft, something LeBron James (among others) has demonstrated.
Music would seem to be another case in which schooling plays a minor role in determining success. But in China, this may soon cease to be the case.
According to a recent article in The Guardian, Chinese culture minister Cai Wu has apparently demanded that all foreign musicians who perform in the
country have college degrees.
Somewhere, Justin Bieber just hired an SAT tutor.
Cai's suggestion comes on the heels of an Elton John concert in Beijing last November, during which the musical icon dedicated his performance to the
"spirit and talent" of dissident artist Ai Weiwei. Though John was permitted to play a subsequent show in Guangzhou in December, his comments did not sit
well with China's authorities, and it is unclear whether he'll be invited back to play in the country.
China has long been highly sensitive to celebrity statements about the country. In 2008, the Icelandic singer Bjork triggered a minor scandal when she shouted "Tibet!" three times at a concert in Shanghai, apparently in the misguided hope that her screams would foment an independence movement in the region.
Celebrities the world over have always shown a predilection for speaking out on politics. The question, then, is why China cares what they say. A
country with the world's second-largest economy, one would think, has bigger matters to attend to than a singer who peaked in popularity more than three decades
ago. But by floating his absurd proposal that foreign performers have university degrees, Cai Wu managed to perpetuate China's image as a petty country
unable to take criticism.
Then again, Cai and other Chinese leaders have another audience in mind: China's people.
Beijing has long gone to great lengths to combat real or perceived
slights to its image, a policy that bolsters its reputation as the defender of China's national honor. It's this defense, accompanied by its stewardship
of the economy, that gives the ruling Communist Party its legitimacy. So while it's unlikely that many Chinese even knew about Elton John's comments, a
fair number would be pleased to hear that the government won't take a foreign star's comments lying down.
Better-behaved foreign musicians need not be deterred from entering the Chinese market, however. The recent Spring Festival gala, an annual variety show
commemorating the Chinese New Year, featured a performance by the lithe Canadian singer Celine Dion. And while none could deny her immense talent, Dion, it should be
noted, never went to college.