Artists and creative types will experience high growth through 2018, according to a new report from the National Endowment for the Arts.
If you can't imagine doing anything outside of the arts, you're in luck. Designers, writers, painters, and other creatives have a surprisingly bright future, according to a new report from the Bureau of Labor Statistics commissioned by the National Endowment for the Arts.
Jobs for artists are projected to grow by 11 percent this decade, a bit faster than the overall workforce. The BLS category "artist occupations" is a catch-all for actors, architects, dancers, designers, photographers, and other cultural workers like interpreters and archivists.
Some of these so-called "creatives" will enjoy explosive growth. The NEA predicts that museum technicians and conservators will grow the most (by 26 percent) between 2008 and 2018, followed by curators (23 percent), interior designers (19 percent), architects (16 percent), writers and authors (15 percent) and actors (13 percent).
As austerity descends on the states, theater and art budgets are fighting to keep taxpayer support. So where is the growth coming from? It's coming from the private sector, where decidedly un-artistic occupations like engineering, technology, and health care are siphoning artists from the freelance world. For example, the NEA suggests that the growing demand for new health care facilities and the hospitality industry will lead to increased demand for interior designers. Explosive growth in online advertising and interactive multimedia serves as a boon for artists and illustrators, while the growth of digital publishing provides new opportunities for fledgling writers and authors.
Will students who dreamed of successful freelance artist careers be prepared for these professional private sector jobs? There is a widespread sense that humanities students assume their sophomore-year papers on phenomenological existentialism, and senior theses on race in post-Renaissance British literature, will translate into dream jobs. But in a 2004 article in The Atlantic, Richard Freeland anticipated a "third way" nestled between a liberal arts education and professional tutelage:
Slowly but surely, higher education is evolving a new paradigm for undergraduate study that erodes the long-standing divide between liberal and professional education. Many liberal arts colleges now offer courses and majors in professional fields; professional disciplines, meanwhile, have become more serious about the arts and sciences. Moreover, universities are encouraging students to include both liberal arts and professional coursework in their programs of study, while internships and other kinds of off-campus experience have gained widespread acceptance in both liberal and professional disciplines. Gradually taking shape is a curricular "third way" that systematically integrates liberal education, professional education, and off-campus experience to produce college graduates who are both well educated and well prepared for the workplace.
If the NEA's analysis is correct, the post-industrial economy may hasten the arrival of Freeland's "third way" in institutions of higher education by plugging independent artists into professional roles.
There is another, less positive reason why artistic jobs might grow faster than the general economy. They're cheap. As my colleague Derek Thompson noted, graduates with arts and humanities degrees are among the lowest median earners by major group, just above social workers and educators. In the early years of the recovery, some of the fastest-growing positions are what economists call "McJobs," offering service work at long hours and barely livable wages.
Still, as 15 million unemployed Americans can attest, barely livable wages beat the alternative of having no job. The NEA's analysis is good news for creatives in all sectors -- and the parents who send them off to college praying they'll land on somebody's payroll.
Image: A street artist in Santa Monica, California (Sharon Mollerus/Flickr)
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, ‘Dad, you ought to take up Pilates. Do some yoga,’” he said. “But I’d say, as far as I’m concerned, there's no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
City dwellers spend nearly every moment of every day awash in wi-fi signals. Homes, streets, businesses, and office buildings are constantly blasting wireless signals every which way for the benefit of nearby phones, tablets, laptops, wearables, and other connected paraphernalia.
When those devices connect to a router, they send requests for information—a weather forecast, the latest sports scores, a news article—and, in turn, receive that data, all over the air. As it communicates with the devices, the router is also gathering information about how its signals are traveling through the air, and whether they’re being disrupted by obstacles or interference. With that data, the router can make small adjustments to communicate more reliably with the devices it’s connected to.
No one will ever find a closer exoplanet—now the race is on to see if there is life on its surface.
One hundred and one years ago this October, a Scottish astronomer named Robert Innes pointed a camera at a grouping of stars near the Southern Cross, the defining feature of the night skies above his adopted Johannesburg. He was looking for a small companion to Alpha Centauri, our closest neighboring star system.
Hunched over glass photographic plates, Innes teased out a signal. Across five years of images, a small, faint star moved, wiggling on the sky. It shifted just as much as Alpha Centauri, suggesting its fate was intertwined with that binary system. But this small star was closer to the sun than Alpha. Innes suggested calling it Proxima Centauri, using the Latin word for “nearest.”
The dim red star soon entered the collective imagination, inspiring dreams of interstellar travel. Gravity has linked the star to the Alpha Centauri system, but our culture of science and storytelling has linked it to the solar system. Today, that link will grow stronger, when an international team of astronomers announces that this nearest of stars also hosts the closest exoplanet, one that might look a whole lot like Earth.
Do mission-driven organizations with tight budgets have any choice but to demand long, unpaid hours of their staffs?
Earlier this year, at the encouragement of President Obama, the Department of Labor finalized the most significant update to the federal rules on overtime in decades. The new rules will more than double the salary threshold for guaranteed overtime pay, from about $23,000 to $47,476. Once the rules go into effect this December, millions of employees who make less than that will be guaranteed overtime pay under the law when they work more than 40 hours a week.
Unsurprisingly, some business lobbies and conservatives disparaged the rule as unduly burdensome. But pushback also came from what might have been an unexpected source: a progressive nonprofit called the U.S. Public Interest Research Group (PIRG). “Doubling the minimum salary to $47,476 is especially unrealistic for non-profit, cause-oriented organizations,” U.S. PIRG said in a statement. “[T]o cover higher staffing costs forced upon us under the rule, we will be forced to hire fewer staff and limit the hours those staff can work—all while the well-funded special interests that we're up against will simply spend more.”
If Hillary Clinton beats Donald Trump, her party will have set a record in American politics.
If Donald Trump can’t erase Hillary Clinton’s lead in the presidential race, the Republican Party will cross an ominous milestone—and confront some agonizing choices. Democrats have won the popular vote in five of the six presidential elections since 1992. (In 2000, Al Gore won the popular vote but lost the Electoral College and the White House to George W. Bush.) If Clinton maintains her consistent advantage in national and swing-state polls through Election Day, that means Democrats will have won the popular vote in six of the past seven presidential campaigns.
Since the 1828 election of Andrew Jackson that historians consider the birth of the modern two-party system, no party has ever won the presidential popular vote six times over seven elections. Even the nation’s most successful political figures have fallen short of that standard.
A recent scholarly paper on “microaggressions” uses them to chart the ascendance of a new moral code in American life.
Last fall at Oberlin College, a talk held as part of Latino Heritage Month was scheduled on the same evening that intramural soccer games were held. As a result, soccer players communicated by email about their respective plans. “Hey, that talk looks pretty great,” a white student wrote to a Hispanic student, “but on the off chance you aren’t going or would rather play futbol instead the club team wants to go!!”
Unbeknownst to the white student, the Hispanic student was offended by the email. And her response signals the rise of a new moral culture in America.
When conflicts occur, sociologists Bradley Campbell and Jason Manning observe in an insightful new scholarly paper, aggrieved parties can respond in any number of ways. In honor cultures like the Old West or the street gangs of West Side Story, they might engage in a duel or physical fight. In dignity cultures, like the ones that prevailed in Western countries during the 19th and 20th Centuries, “insults might provoke offense, but they no longer have the same importance as a way of establishing or destroying a reputation for bravery,” they write. “When intolerable conflicts do arise, dignity cultures prescribe direct but non-violent actions.”
This much is obvious: Young people don’t buy homes like they used to.
In the aftermath of the recession and weak recovery, the share of 18-to-34-year-olds—a.k.a. Millennials—who own a home has fallen to a 30-year low. For the first time on record going back more than a century, young people are now more likely to live with their parents than with a spouse.
It’s become in vogue to argue that young people’s turn against homeownership might be a good thing. After all, houses are not always dependable investment vehicles, a lesson the country learned all too painfully after the Great Recession. Without being anchored to any one city from their mid-20s into their 30s, young people who don’t own are free to roam about the country in search of the best jobs. What’s more, given the copious advantages of a college degree in this economy, perhaps many young people could be commended for investing in their intelligence, professional networks, and abilities rather than devoting that same income to a roof, floor, and furniture.
In his 1945 article “The Economic Organisation of a P.O.W. Camp,” the British economist R.A. Radford recounted his experiences within the informal system of barter among prisoners at Stalag VII-A, a German camp on the outskirts of Munich during the Second World War. Using supplies delivered by the Red Cross, some prisoners moved between different nationalities’ encampments, buying tea for cheap from the French (who tended to prefer coffee) and then selling it to the British (whose affection for tea is a matter of centuries-old lore). Meanwhile, imprisoned Gurkha soldiers from South Asia sought out tins of vegetables and bartered them for corned beef.
Radford’s account, which was compellingly retold by Ray Fisman and Tim Sullivan in their recent book The Inner Lives of Markets, explains that, in the absence of paper money, prisoners had to pick another currency to enable their transactions: cigarettes. “A ration of margarine might be bought for seven cigarettes, the equivalent, for instance, of one and a half chocolate bars, and so on,” Fisman and Sullivan write. “For the most part, prices were well known and consistent among the camp’s many huts that acted as local markets.”
The previous 84 items in this series cover developments that might concern Donald Trump’s opponents. Tonight we have one that might concern his most fervent supporters.
In an interview aired Wednesday evening with his supporter Sean Hannity, Trump showed that he understood the logic behind immigration-reform proposals like that of Marco Rubio’s “Gang of 8.” The starting point for such proposals has been the reality that millions of people are already in the United States without legal permission. Some are ordinary criminals, who if they’re caught are usually jailed or deported. But many others are parents, students, workers, or others who lead regular law-abiding lives except for their illegal immigration status.