Artists and creative types will experience high growth through 2018, according to a new report from the National Endowment for the Arts.
If you can't imagine doing anything outside of the arts, you're in luck. Designers, writers, painters, and other creatives have a surprisingly bright future, according to a new report from the Bureau of Labor Statistics commissioned by the National Endowment for the Arts.
Jobs for artists are projected to grow by 11 percent this decade, a bit faster than the overall workforce. The BLS category "artist occupations" is a catch-all for actors, architects, dancers, designers, photographers, and other cultural workers like interpreters and archivists.
Some of these so-called "creatives" will enjoy explosive growth. The NEA predicts that museum technicians and conservators will grow the most (by 26 percent) between 2008 and 2018, followed by curators (23 percent), interior designers (19 percent), architects (16 percent), writers and authors (15 percent) and actors (13 percent).
As austerity descends on the states, theater and art budgets are fighting to keep taxpayer support. So where is the growth coming from? It's coming from the private sector, where decidedly un-artistic occupations like engineering, technology, and health care are siphoning artists from the freelance world. For example, the NEA suggests that growing demand for new health care facilities and the hospitality industry will lead to increased demand for interior designers. Explosive growth in online advertising and interactive multimedia serves as a boon for artists and illustrators, while the growth of digital publishing provides new opportunities for fledgling writers and authors.
Will students who dreamed of successful freelance artist careers be prepared for these professional private sector jobs? There is a widespread sense that humanities students assume their sophomore-year papers on phenomenological existentialism, and senior theses on race in post-Renaissance British literature, will translate into dream jobs. But in a 2004 article in The Atlantic, Richard Freeland anticipated a "third way" nestled between a liberal arts education and professional tutelage:
Slowly but surely, higher education is evolving a new paradigm for undergraduate study that erodes the long-standing divide between liberal and professional education. Many liberal arts colleges now offer courses and majors in professional fields; professional disciplines, meanwhile, have become more serious about the arts and sciences. Moreover, universities are encouraging students to include both liberal arts and professional coursework in their programs of study, while internships and other kinds of off-campus experience have gained widespread acceptance in both liberal and professional disciplines. Gradually taking shape is a curricular "third way" that systematically integrates liberal education, professional education, and off-campus experience to produce college graduates who are both well educated and well prepared for the workplace.
If the NEA's analysis is correct, the post-industrial economy may hasten the arrival of Freeland's "third way" in institutions of higher education by plugging independent artists into professional roles.
There is another, less positive reason why artistic jobs might grow faster than the general economy. They're cheap. As my colleague Derek Thompson noted, graduates with arts and humanities degrees are among the lowest median earners by major group, just above social workers and educators. In the early years of the recovery, some of the fastest-growing positions are what economists call "McJobs," offering service work at long hours and barely livable wages.
Still, as 15 million unemployed Americans can attest, barely livable wages beat the alternative of having no job. The NEA's analysis is good news for creatives in all sectors -- and for the parents who send them off to college praying they'll land on somebody's payroll.
Image: A street artist in Santa Monica, California (Sharon Mollerus/Flickr)
“Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage.”
Last Sunday the host of a popular news show asked me what it meant to lose my body. The host was broadcasting from Washington, D.C., and I was seated in a remote studio on the far west side of Manhattan. A satellite closed the miles between us, but no machinery could close the gap between her world and the world for which I had been summoned to speak. When the host asked me about my body, her face faded from the screen, and was replaced by a scroll of words, written by me earlier that week.
The host read these words for the audience, and when she finished she turned to the subject of my body, although she did not mention it specifically. But by now I am accustomed to intelligent people asking about the condition of my body without realizing the nature of their request. Specifically, the host wished to know why I felt that white America’s progress, or rather the progress of those Americans who believe that they are white, was built on looting and violence. Hearing this, I felt an old and indistinct sadness well up in me. The answer to this question is the record of the believers themselves. The answer is American history.
As the world frets over Greece, a separate crisis looms in China.
This summer has not been calm for the global economy. In Europe, a Greek referendum this Sunday may determine whether the country will remain in the eurozone. In North America, meanwhile, the governor of Puerto Rico claimed last week that the island would be unable to pay off its debts, raising unsettling questions about the health of American municipal bonds.
But the season’s biggest economic crisis may be occurring in Asia, where shares in China’s two major stock exchanges have nosedived in the past three weeks. Since June 12, the Shanghai stock exchange has lost 24 percent of its value, while the damage in the southern city of Shenzhen has been even greater at 30 percent. The tumble has already wiped out more than $2.4 trillion in wealth—a figure roughly 10 times the size of Greece’s economy.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that the two institutions are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to allow doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even the right on others: He's against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
An attorney who helped players file a gender-discrimination lawsuit over artificial turf in the World Cup proposes a way forward for the sport.
On Sunday, players from the U.S. and Japan’s women’s soccer teams will step onto the field in Vancouver to compete for the sport’s greatest achievement: the World Cup. But perhaps the bigger battle—one that started well before the final match and will continue well after—isn’t about a trophy or national glory. Women’s soccer teams have long fought for recognition and respect not just from the public, but also from the male organizers of the sport, and it’s a struggle symbolized by the very fields they’ve been playing on.
The co-hosts of the World Cup—FIFA and the Canadian Soccer Association—declined to stage this year’s tournament on real grass, as every previous World Cup has been, mandating instead that it be played on artificial turf. This is despite the dangers and inconveniences plastic turf poses. The synthetic pitches bake in the sun, with surface temperatures sometimes reaching 120 degrees. Clouds of rubber pebbles fly into players’ eyes, and the turf makes it difficult for the women to gauge the way the ball will bounce.
The author of Between the World and Me asks readers to submit their own experiences with racism and its physical consequences.
It was not until perhaps the fourth or fifth draft of my forthcoming book, Between The World And Me, that it took the form of a letter. The reason for the change was literary: before then I had a somewhat messy, vaguely linear blob of an essay, without any sense of who it was meant to address. But once I angled it to my son, once I knew my audience, the train began to run on the track.
I made this decision with some hesitation. “The Talk”—a conversation between black parents and their children about, but not limited to, the dangers of police brutality—has begun to ooze with sentiment and melodrama. I find myself, now, shuddering at the phrase. And yet there is something real there, something of value. My hope was to take the concept of “The Talk” and strip it of sentiment, make it visceral, ground it in the physical lives of black people.