This was Treasury Secretary Jack Lew's awesome signature in January.
This is Lew's depressingly ordinary (and still illegible) signature in June.
Obama's second term continues to disappoint in new and surprising ways.
Derek Thompson is a senior editor at The Atlantic, where he oversees business coverage for TheAtlantic.com. Thompson has written for Slate, BusinessWeek, and the Daily Beast. He has also appeared as a guest on radio and television networks, including NPR, the BBC, CNBC, and MSNBC.
It's an indication of our great fortune, our great decadence, or both, that when you type the word "viral" into a Google search bar, the first result isn't influenza or meningitis. It's videos.
While the former two subjects capture the attention of real scientists, the secrets of virality have captured the less-than-purely-scientific attention of marketers. These sociologists of the social Web claim they can distill cultures of cat GIFs and quirky videos to their essence and extract the very thing that made them infectious in the first place.
There are cottage industries for every strand of self-improvement, so naturally there is a healthy supply of scientists and marketers who are happy to share with you the "secrets" of making stuff "go viral." Jonah Berger's new book Contagious: Why Things Catch On is a hallmark of the genre -- a younger cousin of Made to Stick -- and as a collection of anecdotes, it's perfectly interesting. Berger boils down his theory of stickiness to a portable acronym, STEPPS, which starts with Social currency (people share things that make them look good) and Triggers ("top of mind, top of tongue") and ends with Stories. He explains, most memorably, that a study of the New York Times most-emailed list found that education and health topics dominated the top. Stories with gosh-wow revelations about the world trigger "physiological arousal," he said, putting readers in a mood to share their awe with friends.
Then there is Thales S. Teixeira, an assistant professor of marketing with Harvard Business School, who recently published an essay, "The Key to Viral Videos." In one study, he watched participants watching YouTube ads, registering their facial reactions. "The data showed that evoking surprise was the best way to attract attention," he concluded, "while evoking continuous moments of joy was the best way of retaining it." Thumbing through the literature of virality, you'll find enough quotable observations to make for a fairly interesting hour at the bar: Limiting invites to a social media app makes the app seem more desirable (scarcity goes viral); products that look distinct are their own advertisements (unique goes viral); Mars bar sales perked up during the Mars Pathfinder mission (triggers go viral).
But do these insights really form a blueprint for making awesomely popular stuff, or do they simply identify stuff that turned out to be awesomely popular and pretend that such serendipity can be reverse-engineered? People tend to share videos that make them feel good, Berger points out. That sounds right, but how do you explain scary chain letters, the power of negative headlines (just ask any journalist), or the Arab Spring? The stickiest content has "practical value," Berger says. But the most popular non-musical video in YouTube history is "Charlie Bit My Finger," which isn't a guide to anything except how funny British kids sound when they're crying.
The problem with most of the viral secrets isn't that they're obviously
wrong. It's that they're obvious. So obvious, in fact, that they betray
the very idea of a secret. Take the conclusion of the NYT most-emailed study: middle-aged parents with school-aged kids are interested in health and education stories. That's practically a tautology.
The idea that scarce things are valuable is the foundational concept of
economics; anything that explains the basic markets for oil, wheat and
diamond rings isn't classified information. It's true that stories help us remember
important ideas, but that revelation is as old as the Bible. (And,
honestly, if you try to make the argument that the New Testament went
viral, you deserve whatever biblical punishment you get.)
I'm not suggesting that Berger and Teixeira are simple-minded, just that their conclusions are simple. If the common elements of viral content are rudimentary, then the secret to virality must be very mysterious indeed, since the vast majority of stuff doesn't get millions of hits on YouTube.
There is really only one secret to virality. Hollywood knows it. Music studios know it. Publishing houses know it. BuzzFeed knows it. You probably know it, too. (In other words, it isn't really a secret, either.) It's ... the past.
A year ago, Slate's Farhad Manjoo wondered how BuzzFeed made so many sensationally popular stories. To his dismay, he learned that its writers weren't baking from scratch but rather adding icing to other people's cakes. Almost every big hit at BuzzFeed seemed to be an extension of a successful article from somewhere else on the Web (and usually Reddit). "The secret to [BuzzFeed's] viral success is to find stuff that's already a minor viral success and make it better," Manjoo wrote. "Repeat the process enough, and you're bound to get a few mega-hits."
But as I pointed out last year, that's not a shameful way to become a hits machine. Learning from the past is the only way to become a reliable hits machine. Look at Hollywood, where the top 39 movie openings in history are all sequels, reboots, and adaptations. Look at the music industry, where one study showed that today's hit pop songs are more similar in chord progressions and texture than at any time since the 1950s. Look at the publishing industry, where imprints bank on what books sold last month and last year (like "little books about big ideas" or one-word non-fiction books about social sciences) and try to pump out more of the same to cover their budgets. Learning from the past and trying to recreate it with a twist isn't a secret. It's just what professionals in hit-making businesses do.
Everywhere, we see that "the secret sauce" is a tweak of some popular thing that came before it. This is even the case for actual secret sauces. Forty years ago, somebody discovered a McDonald's Manager's Handbook written in 1969 and published the top-secret recipe for the pre-made sauce used on Big Macs. The instructions are now all over the Web. If you're wondering: It's basically a dressed-up Thousand Island dressing.
Nobody discovered the secret of sauce, after all. A couple of chefs had simply picked a perfectly good sauce that already existed in almost every fridge in America and added sugar and mayo. Now these were people who understood the elements of virality. Too bad they didn't make a video.
Let's talk about your next click.
At The Atlantic, we're always interested in understanding as much as we can about how you read us. What sorts of headlines draw the best traffic? How can we encourage you to finish a story and promote it on social media? Particularly challenging is getting readers to click on more Atlantic articles.
So we've put the question to our analytics team: When somebody clicks on an Atlantic article, where does she go next?
It turns out that, despite our best efforts to tease headlines at the top of the page and all along the right rail, the most popular "second click" is all the way toward the bottom. It's the "Most Popular" box listing the most-read articles on the site. Even after we redesigned our article pages and moved the location of the box, our data still showed that more people were clicking one of the ten articles in the "Most Popular" list than any other module on the article page.
This is a little weird. The "Most Popular" box is practically buried on our article pages, below the last word most people read. It often reflects old articles pinging around email long after the news is fresh. Don't you guys want our new stuff? Apparently not.
And yet, I'm as guilty as anybody of navigating by lists. I gravitate toward the "Most Popular" boxes on the New York Times, Slate, the New Yorker. I go to sites like Reddit and Digg specifically to learn what other people are clicking. I use Twitter as my homepage of news precisely because I'm more interested in what people are actually reading than in what institutional homepages are interested in selling.
I read (and, I'm guessing, you read) "most popular" boxes not only because we're savvy, but also because we're lazy. Where's the best stuff? is the question that motivates my Internet snooping. But that's actually a hard if not impossible question to answer. An easier question to answer is: What's everybody else reading? Hence, the power of the "Most Popular" box. Why drive yourself around a website when you can let the Internet's eyeballs drive for you?
The Internet didn't invent top-ten lists. In the markets for books, music, and movies, consumers navigate a complicated world of abundance by seeking lists and reviews to limit and order our options. But what effect does leading-by-lists have on our opinions of those books, music, movies, and experiences?
Sociologists Matt Salganik and Duncan Watts ran an experiment with music. Subjects received 48 songs listed in order of "most downloaded." They listened to the songs and downloaded their favorites.
There was a catch, naturally. One group saw the true rankings of the most-downloaded songs. The other group saw the rankings reversed, with the most popular song at the bottom and the least popular song at the top. It turned out that subjects with the reversed list downloaded the least popular song the most. The mistaken belief that a song was popular -- even if the song was demonstrably unpopular! -- made it most likely to be downloaded.
Essentially, top-ten lists have a kind of placebo effect, the researchers concluded. They don't just tell us what to read or listen to. They tell us what to like. And they make us think we like something just because we saw it next to the number 1.
If two sociologists can persuade their subjects that an unpopular song is a great song by putting a "1" next to it, you can imagine similar duplicity in the real world having significantly larger consequences. Yelp has famously battled floods of fake reviews that steer people to bad restaurants with false positive ratings. For the millions of Americans who orient their reading habits by top-ten lists, buying books in bulk to force them onto a NYT bestseller list becomes a remarkably effective form of advertising. Mobile apps have an incentive to cheat their way onto the front page of the App Store, since many app shoppers hardly make it to page two.
It's well understood that lists and rankings can be fixed. But Salganik and Watts' research makes a bigger claim: that fixed rankings can dupe us into liking things that we wouldn't have liked if they hadn't been ranked more highly. The placebo effect of most-popular lists suggests that better-reviewed meals might actually taste better; more-downloaded songs might actually sound better; articles with more Facebook likes might actually feel more delightful to read. When we outsource our navigation of the world to other people's opinions, we lose, in a small way, our ability to individually evaluate the quality of our experience.
Maybe that's a perfectly acceptable risk in exchange for the convenience of using simple lists to orient ourselves in a universe of stuff. Still, it's sobering to think how much we're leaving to the "wisdom" of the crowd.
Self-control is hard. People tend to go for "smaller-sooner" benefits over "larger-later" rewards. Economists call this "hyperbolic discounting." You call it "I'll go to the gym next Tuesday." Nothing encapsulates procrastination better than snoozing your alarm for 30 minutes because it's easier to lie in bed than think about the costs of starting work late.
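For readers who like to see the arithmetic, here is a minimal sketch of how hyperbolic discounting makes "smaller-sooner" beat "larger-later." The function name, the discount rate k, and the reward numbers are all illustrative assumptions, not figures from any study cited here.

```python
# Toy model of hyperbolic discounting: a reward received after a
# delay d is valued at roughly reward / (1 + k * d), so payoffs
# that are far away shrink fast while immediate ones keep full value.

def hyperbolic_value(reward, delay_days, k=0.5):
    """Present value of a delayed reward under hyperbolic discounting.
    k is an illustrative discount rate, not an empirical estimate."""
    return reward / (1 + k * delay_days)

# "Larger-later": a big health payoff from the gym, a month away.
gym_benefit = hyperbolic_value(reward=100, delay_days=30)  # 100/16 = 6.25

# "Smaller-sooner": the small, immediate comfort of staying in bed.
bed_comfort = hyperbolic_value(reward=10, delay_days=0)    # 10/1 = 10.0

# The tiny immediate reward wins, which is why the gym keeps
# getting rescheduled to "next Tuesday."
print(bed_comfort > gym_benefit)  # True
```

The point of the sketch is only that the math agrees with the intuition: as long as the delayed payoff is discounted steeply enough, snoozing rationally "wins" every single morning.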
But here's an economic punishment that will wake you up in the morning: An alarm clock that shreds a dollar bill if you don't smack it in the first few seconds.
Utter genius. And actually in keeping with most research on how to pull delayed costs forward into immediate punishments!
Now, you might not wake up every morning thinking about hyperbolic discounting. Lucky you. But for the sleep-deprived economist in your life, this is the gift that says: "Wake the &*$# up and put your money where current hyperbolic discounting research is."
This week, Amazon announced plans to bring online grocery shopping to Los Angeles with a new service called Amazon Prime Fresh. For $79, Amazon Prime already gives customers two-day shipping and access to thousands of streaming movies and TV shows. But for a $299 annual fee, Prime Fresh members can order fresh food from Amazon from their couch and expect groceries at the door in a matter of hours.
Amazon CEO Jeff Bezos has already trained Wall Street to expect that little he does will turn a profit for years, so don't expect Prime Fresh to make much money. At least for now. This move continues Amazon's assault on retail, where even giants like Walmart and Costco have conceded that online grocery delivery is a nearly impossible business. Amazon's message is: Impossible for you, maybe...
A $300 subscription to Amazon Prime Fresh doesn't just buy access; it also binds shoppers to Amazon as their overwhelming source of all Internet shopping. "It will help to make Amazon the starting point for online purchases -- more than it already was -- and give consumers even less of a reason to shop anywhere else," Morningstar equity analyst R.J. Hottovy said. Being the starting point for online purchases is everything: Google's biggest source of online advertising comes from searches with a shopping intent. Why look anywhere else when only Amazon will get it to you today?
As Farhad Manjoo wrote at Slate, Amazon's grand infrastructure strategy has visibly shifted. At first, it was: set up distribution centers in cheap states and ship to where the people are. Now the company is buying warehouses in the largest cities so that once you click BUY, online orders can go from its distribution centers to your doorstep in hours, not days. More from Farhad:
If Amazon can send me stuff overnight for free without a distribution center nearby, it's not hard to guess what it can do once it has lots of warehouses within driving distance of my house. Instead of surprising me by getting something to me the next day, I suspect that, over the next few years, next-day service will become its default shipping method on most of its items ... Getting something shipped to your house offers gratification that's even more instant: Order something in the morning and get it later in the day, without doing anything else. Why would you ever shop anywhere else?
This ambitious strategy would be sort of insane for any large-cap company whose shareholders care about quarterly earnings. That is to say: any company except Bezos'. But Amazon is already so dominant in cloud and e-retail that shareholders expect the company will eventually flip a switch on its infrastructure quasi-monopoly, and voila, profits forever.
Today's most famous infrastructure quasi-monopolies in the private sector are probably the cable companies. Laying cable is hella-expensive for both legal and material reasons (Verizon abandoned its nationwide build-out after covering less than 20 percent of the country), and cable companies can charge such a markup on the communications bundle precisely because they have a massive infrastructure advantage in a high-barrier industry.
Ditto Amazon, which is building a bundle of its own. Prime Fresh offers a unique package of services that takes advantage of the company's lead in digital and physical infrastructure: infinite books, fast shipping, fresh groceries, free streaming. Who in the world would try to build a competitor to this strange amalgam of hugely expensive and hardly profitable services?
No one. And, for Bezos, that is precisely the point.
Alan Krueger's thoroughly entertaining economic speech (that phrase is not an oxymoron) at the Rock and Roll Hall of Fame (that address is not a typo) is a potpourri of cool factoids, but the bottom line is that if you want to understand economic inequality in the U.S., start with the music industry.
Here's the evolution of the "winner-take-all" music biz, where the top 1 percent of artists have more than doubled their share of ticket revenue since 1982 ...
... and here's the evolution of the "winner-take-all" American economy, where the top 1 percent of earners have doubled their share of national income, too.
The simplest way to explain both trends in the same breath is to say that globalization and technology have conspired to give the world unprecedented access to the best stuff (songs, socks, smartphones). This gives the best producers (of songs, socks, and smartphones) access to more wallets than ever. And that helps the folks behind the world's best songs, socks, and smartphones use their best-in-class status to gobble up more money than ever before.
But the most interesting way that the music industry teaches us about the overall economy isn't income inequality, exactly. It's duplicability.
Once you can copy something, its price goes to zero. You can copy a song file. You can copy a video of a song file. These things aren't unique. But a live concert is. So look what's happened to prices. In the last 30 years, listening to music has become cheaper than ever, while watching live performances has grown more expensive. "The price of the average concert ticket increased by nearly 400% from 1981 to 2012," Krueger said, more than twice the rate of inflation.
In many ways, the middle class jobs crisis in the last half-century has been a crisis of replicability. Last century, the pool of manufacturing workers for U.S. companies was limited by the bounds of the contiguous United States. They made decent wages. In the last 30 years, those same companies have learned that Chinese people, Vietnamese people, and varieties of robots perform the same tasks for less money. As the labor pool doubled and doubled, manufacturing work in the U.S. disintegrated.
The same way that concerts (i.e.: unique, local music events that can't be duplicated) have come to dominate the music business, local "non-tradable" industries have come to dominate job creation in the last generation. As I've written, about half of the net jobs created between 1990 and 2008 were in education, health care, and
government -- local industries shielded from the duplicative forces of globalization and technology, since we don't visit doctors in China or take Econometrics from robots (yet).
The cashless society -- a world where physical money is practically obsolete -- has, in just a few years, gone from a utopian dream to something like an inevitability. In Sweden, a national effort is underway to take the country cashless within two decades. Throughout Africa, it's perfectly common for merchants to accept money through mobile phones by having buyers transfer a specific amount of money to a specific number associated with the merchant.
In the U.S., the road to cashlessness is paved in plastic (glass, too). In the 1970s, fewer than 20 percent of the adult population owned a credit card. Today, between 70 and 80 percent of the adult population does. In some cities, being forced to pay with cash already feels like a precious anachronism ("What do you mean I have to count the money before extending my arm to the register?").
The world of economic research has tried to keep pace with the plastic revolution, producing hundreds of reports on how MasterCard, Visa, and AmEx change our relationship to money and ourselves. The logic of credit is fairly simple. People rarely spend exactly what they earn, exactly when they earn it. With savings, we pass today's earnings to the future. With credit, we pull expected future earnings into today.
The problem is that consumers (and perhaps Americans, in particular) aren't so good at either. We don't save much, and we're awful at projecting future earnings, spending far more than we're able to pay back quickly. Lower-income people, consumers who are worse at math, people
who self-report emotional instability, introversion, or materialism, have all been found to get into trouble with credit cards. Here are some more findings from the reams of credit card research -- and few of them are good.
Credit Cards Are Making You Irresponsible
The typical knock on credit cards is that they're too effective at letting us buy stuff. Cash and coins must be considered, handled, counted, organized, re-counted, negotiated into the small space of a palm, and delivered cleanly to a merchant. Each of these verbs represents an inconvenience -- a point of friction. But a card is just a card. Pull, swipe, finished. It's so easy to spend whatever we want.
Too easy, actually. Research has shown that people who own more credit cards spend more overall; more in specific stores; more at restaurants; more on tips at restaurants ... literally, there are hundreds of studies on the effect of credit cards on spending, and the vast majority of them find that, all things equal, we put more on plastic.
In 2001, two business professors from MIT organized an auction for Boston Celtics tickets where one group bid with cash and one group bid with credit. The credit card group offered nearly twice as much for the tickets. "Framing hypothetical purchases as credit card payments may significantly increase likelihood of purchase and willingness to pay," the researchers wrote. They put their cheeky credit card advice right there in the headline: "Always Leave Home Without It."
Credit Cards Are Making You Forgetful
The downside of counting money is that it takes time and effort. The upside is that it takes time and effort. That makes it more memorable. Cards make us forget we're dealing with money. They create "an illusion of liquidity," wrote Dilip Soman, a professor at the University of Colorado at Boulder, that makes consumers confuse the ability to spend money and the means to spend money. When paying with plastic, buyers have a tendency to outsource their mindfulness to the card. As a result, they are less likely to remember details about their purchases and more likely to buy additional items.
Credit Cards Are Making You Fat
The "pain" of paying with cash has a hidden benefit. It makes it harder to quickly capitulate to indulgences. Credit cards "weaken impulse control," Manoj Thomas, Kalpesh Kaushik Desai, and Satheeshkumar Seenivasan found in a 2011 paper published in the Journal of Consumer Research. "Consequently, consumers are more likely to buy unhealthy food products when they pay by credit card than when they pay in cash." Studying the contents of shopping baskets, the three researchers found that shoppers with credit cards bought a larger share of food items they had ranked as unhealthy. In this way, the permissiveness of credit cards weakens consumers' judgment in more subtle ways than total amount spent.
Credit Cards Exacerbate Income Inequality
It's easy to see how credit cards might let low-income families spend more than they earn and live a more comfortable life. But there are a few problems with that story. First, families can't outrun their actual earnings, and too often credit cards provide the illusion of a better life followed by the crushing reality of debt and costly penalties. More subtly, credit cards create a transfer of money from the poor to the rich by punishing non-credit-card consumers. In their paper "Who Gains and Who Loses from Credit Card Payments?" Scott Schuh, Oz Shy, and Joanna Stavins pointed out that credit cards incur merchant fees that show up in other prices. Unable to impose a surcharge penalty on credit card customers alone, merchants often raise prices for all customers. This creates higher costs for non-card-carrying (often low-income) shoppers. So, credit cards both mitigate income inequality in the short run and exacerbate it in the long run.
The idea that you can't buy happiness has been exposed as a myth, over and over. Richer countries are happier than poor countries. Richer people within richer countries are happier, too. The evidence is unequivocal: Money makes you happy. You just have to know what to do with it.
So what should you do with it?
Stop buying so much stuff, renowned psychologist Daniel Gilbert told me in an interview a few years ago, and try to spend more money on experiences. "We think that experiences can be fun but leave us with nothing to show for them," he said. "But that turns out to be a good thing." Happiness, for most people not named Sartre, is other people; and experiences are usually shared -- first when they happen and then again and again when we tell our friends.
On the other hand, objects wear out their welcome. If you really love a rug, you might buy it. The first few times you see it, you might admire it and feel happy. But over time, it will probably reveal itself to be just a rug. Try to remember the last time an old piece of furniture made you ecstatic. For me, at least, it's a difficult exercise. The wonder of my potted plants certainly wanes with time. "Psychologists call this habituation, economists call it declining marginal utility, and the rest of us call it marriage," Gilbert wrote in Stumbling on Happiness.
But there might be another reason why buying objects rather than experiences tends to disappoint. For the most materialistic people, there might be something dull -- even disappointing -- about the act of buying itself.
"Materialists are more likely to overspend and have credit problems, possibly because they believe that acquisitions will increase their happiness and change their lives in meaningful ways," Marsha L. Richins of the University of Missouri concludes in her new paper, "When Wanting Is Better Than Having," published this month in the Journal of Consumer Research. But in three separate studies, materialists reported significantly more happiness thinking about their purchase beforehand than they did from actually owning the thing they wanted.
"Thinking about acquisition provides momentary happiness boosts to materialistic people, and because they tend to think about acquisition a lot, such thoughts have the potential to provide frequent mood boosts," Richins wrote, "but the positive emotions associated with acquisition are short-lived. Although materialists still experience positive emotions after making a purchase, these emotions are less intense than before they actually acquire a product."
Once again, it would seem that experiences make us happier than stuff -- even in the act of buying.
The finding that paying for something is less satisfying than wanting it shouldn't be confused with the idea that buying things makes us sad. It's hard to find a study showing that "retail therapy" (i.e.: shopping your way out of a bad mood) doesn't work; most research suggests that a well-timed excursion to the mall can lift one's spirits. But if Gilbert and Richins are right, then the bulk of the therapy provided by shopping is everything that happens before the check-out counter. You don't have to go into debt to achieve nearly the same emotional gains from materialism.
In my column for The Atlantic this month, Death of the Salesmen, I found that the retail space is generally divided between stores racing to the price bottom to attract lower-income consumers and stores clinging to the patina of a shopping experience to lure richer shoppers. Maybe those stores, and their customers, understand Richins' research, intuitively. When we're shopping, not for the things we need, but for the things we merely want, it's the experience of shopping and buying that makes us truly happy.
Booz Allen announced this morning that the company has officially fired Edward Snowden, the man behind the NSA leaks, for "violations of the firm's code of ethics."
Here is the complete statement (with new updates underlined):
Booz Allen can confirm that Edward Snowden, 29, was an employee of our firm for less than 3 months, assigned to a team in Hawaii. Snowden, who had a salary at the rate of $122,000, was terminated June 10, 2013 for violations of the firm's code of ethics and firm policy. News reports that this individual has claimed to have leaked classified information are shocking, and if accurate, this action represents a grave violation of the code of conduct and core values of our firm. We will work closely with our clients and authorities in their investigation of this matter.
Strangely, but probably not significantly, the company claims that Snowden's salary was $122,000, even though Snowden told reporters that he was making about $200,000.
Also of potential interest: Glenn Greenwald, the Guardian journalist who broke the surveillance story, says he has been working with Snowden since February. If Booz only employed Snowden for "less than three months" before his termination, as they claim, that puts Snowden's first day at Booz some time in March.
The United States is the greatest country in the history of everything, if you just listen to its leaders, and a disgrace among developed countries, if you just read international surveys. Our health care system is famously expensive and inaccessible. Our education system is famously broken. Oh, and our income inequality? It's just famous.
The OECD Better Life Index, released last week, feeds the American instinct toward both jingoism and self-deprecation. In housing access and family wealth, it concludes that the U.S. really is the best country in the world. But we rank 28th among advanced nations in the category of "work-life balance," ninth from the bottom.
This raises a thorny question: If we're so rich, why are we working so hard that we don't even have time to cherish the fruits of our productivity?
There are some simple reasons why the U.S. places far below Scandinavia and other European countries on work-life metrics. We work longer hours to make all that money. So we have less down time. Also, we don't have national laws, like mandatory parental leave, that alleviate the burden on working moms.
The surprising fact is that American leisure time has actually been increasing for most families for decades, and American men work less today, and have more down time, than ever recorded. Even if you consider that to be bad news (and many do), less work should improve just about any definition of work-life balance. Still, the most important reason why we rank barely above Mexico is the increase in single mothers who, in the U.S., face an extraordinary burden relative to their overseas counterparts.
Surprise: Leisure Time Is Growing (But Not For All)
Since 1950, personal hours worked have fallen dramatically all over the developed world, thanks to advances in overall productivity and the shift away from certain kinds of time-intensive manufacturing. They've fallen the most in European countries and the least in the U.S.
But those gross averages hide the fact that the workweek has undergone two parallel revolutions in the U.S.: More paid work for women and less paid work for men. Hours worked by moms have doubled since 1960. Higher educational attainment and the rise of the service sector have allowed many women to trade chores for paychecks, as this graph shows (data via Valerie Ramey).
In the meantime, men have picked up some of the slack at home. In the 20th century, the typical working woman's weekly hours rose by 230 percent; in parallel, men are doing about 370 percent more housework.
"Leisure" means different things to different people. But to economists it means time spent not working -- either the kind that involves doing chores or the kind that involves doing Excel. In the last century, lifelong leisure time in the U.S. has grown significantly, due to at least three factors: (1) the decline of the workweek, which most affected men; (2) technology making housework more efficient, which most affected women; and (3) people living longer in retirement, which affected both men and women.
You might think the increase in leisure would be highest among the rich, since nations have generally earned more leisure time as they've become more productive. But strangely, it is the least educated and poorest men who have seen the highest gains in leisure. This has created what economist Erik Hurst, among others, calls "leisure inequality," which mirrors income inequality. Poor working men have more leisure time than ever, but the most educated men have less downtime than they've had in 50 years.
The OECD's "work-life" balance measure counts long hours, leisure time, commuting time, satisfaction with job, and the employment rate of mothers with school-age children. Although the U.S. places near the bottom overall, it actually places among the top countries in commuting time and job satisfaction. And as you can see, we're making strides in overall leisure time as well. But the most important category where we fall far behind other rich countries is with mothers -- especially single mothers.
The Single Mom Crisis
Women are the primary breadwinners in 40 percent of U.S. households today. But in most of those families, women are the primary earner because they are the only earner. One in four households is now led by a single mom, who earns an average income of just $23,000.
Balancing work and leisure without a partner isn't easy no matter where you live, but single working mothers feel a particular pinch in the U.S., for two reasons. First, the U.S. has the fourth-highest share of single mothers in the OECD. Second, American single mothers work longer hours and have more children than their counterparts in most rich countries, according to a study of family time. "Lone mothers in the US have less available time than lone mothers in any of the other countries," the researchers found.
Single mothers are more likely to work than the average adult -- after all, the vast majority of them simply must -- but they're also more likely to work fewer hours. Even in the U.S., where single mothers work the most, only 4 percent punch in more than 50 hours a week.
So when you hear that American work-life balance ranks poorly, remember that there really isn't any such thing as "American work-life balance." Instead there are intersecting trends -- only a handful of which I've touched on here -- showing that, although the workweek has fallen, the changing composition of families has put tremendous time-stresses on more mothers. Overall, research shows that lower-income men have never had more downtime, while working single mothers have never been more common. The first part is a problem. The second is a crisis.
Students are used to thinking of college as a requirement, or a career accelerator, or a four-year party. But maybe it's best to think of it as a straightforward investment.
Before making an investment in a stock or a house, you would do research. You would consider the costs weighed against future returns. You would know there are no guarantees. It should be the same with college. Like buying a house, the most important question isn't the total price, but whether you can afford to pay it off in the long run.
For most students, the college investment "appears to pay off," according to Philip Oreopoulos and Uros Petronijevic in a wonderful new study of studies on the benefits of higher education. Their lit review covers a lot of ground, from the growth of student debt to the most lucrative majors. Here are some of their major findings, gleaned from reams of research from the past 40 years, organized in an easy click-print-and-paste FAQ chart for young people thinking about more school -- and older people debating whether college is still worth it.
Europe's job market is a historic disaster.
The EU unemployment rate set a new all-time high of 12.2 percent, according to today's estimates. But it's the youth unemployment crisis that's truly terrifying. In Spain, youth unemployment surged past 56 percent, and Greece now leads the rich world with an astonishing 62.5 percent of its youth workforce out of a job (graph via James Plunkett).
My God, look at Greece's trajectory. That thing isn't slowing down. Since April 2012, Greek youth unemployment has grown by about one percentage point a month. At that rate, it would pass 70 percent in early 2014.
It is suddenly not insane to imagine a youth unemployment rate of 70 percent in the developed world. And that is insane.
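The "70 percent in early 2014" figure above is just a linear extrapolation, and a few lines of arithmetic show how it falls out of the numbers in the paragraph (a rough sketch using the article's figures; the steady one-point-per-month gain is the assumption doing all the work):

```python
# Greek youth unemployment extrapolation, using the article's figures:
# ~62.5% in mid-2013, rising roughly one percentage point per month.
rate = 62.5          # youth unemployment rate, mid-2013 (percent)
monthly_gain = 1.0   # assumed rise per month (percentage points)

# How many months until the rate crosses 70 percent?
months_to_70 = (70 - rate) / monthly_gain
print(f"Months until 70%: {months_to_70:.1f}")  # ~7.5 months, i.e. early 2014
```

Of course, nothing says the trend has to stay linear -- which is rather the point of hoping it doesn't.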
It should be noted that some people consider youth unemployment figures a bit hyperbolic. They prefer measures like the "youth unemployment ratio," which takes the number of young people who are looking for work but can't find it and divides it by the entire youth population, employed or not. Last year, the EU's youth unemployment ratio was 9.7 percent, less than half the youth unemployment rate of 23 percent.
But even the ratio fails to account for the millions of young people who have all but given up in their awful economies. There are 26 million young people in rich countries who count as "NEETs" (Not in Employment, Education, or Training), according to the OECD.
Youth unemployment is bad for all the obvious reasons, including the big loss to future productivity and earnings. But Europe's youth unemployment is strange, because we've never seen a generation *this educated* also be this unemployed. Nearly 40 percent of Spain's 20- and early-30-somethings are college educated. In Greece, it's 30 percent. Europe's crisis -- clearly worsened by its austerity obsession -- is an absurd waste of the most educated generation in the continent's history.
Oh. My. God. Let's treat this Ron Burgundy moment with some data.
"I'm so used to liberals telling conservatives that they're anti-science. But liberals who defend this and say it is not a bad thing are very anti-science. When you look at biology -- when you look at the natural world -- the roles of a male and a female in society and in other animals, the male typically is the dominant role. The female, it's not antithesis, or it's not competing, it's a complementary role. We've lost the ability to have complementary relationships ... and it's tearing us apart."
Here's the thing about this chart. This isn't a picture of the "unnatural" world that Erickson fears. This is the natural world! If anything, the unnatural world is the one where the law deprived women of the right to vote until 1920 and where we discouraged women from working alongside men or doing anything besides raising kids and cooking dinner.
It might be the most famous statistic about female workers in the United States: Women earn "only 72 percent as much as their male counterparts."
It's also famously false.
A new survey from PayScale this morning finds that the wage gap nearly evaporates when you control for occupation and experience among the most common jobs, especially among less experienced workers. It is only as careers advance, the researchers found, that men outpace women's earnings as they make their way toward the executive suite.
So, women aren't starting off behind their male counterparts, so much as they're choosing different jobs and losing ground later in their careers.
The irony is that as women advance in their own careers, they might be more likely to fall behind, but they are also more likely to negotiate. That popular refrain that women don't know how to ask for a raise? That's bunk, too, the researchers concluded. Nearly a third of women -- and 29 percent of men -- have asked for raises, and even more female executives have done the same. In female-dominated sectors like health care and education, more than half of women have negotiated for salary, benefits, or a promotion.
Still, inequalities persist. Comparing men and women job-by-job conceals the fact that men still dominate many of the highest-paying jobs. PayScale studied more than 120 occupation categories, from "machinist" to "dietician." Nine of the ten lowest-paying jobs (e.g.: child-care worker, library assistant) were disproportionately female. Nine of the ten highest-paying jobs (e.g.: software architect, psychiatrist) were majority male. Nurse anesthetist was the best-paid position held mostly by women; but an estimated 69 percent of better-paid anesthesiologists were male.
The highest-paid job in PayScale's controlled set is anesthesiologist, a position that is 69 percent male and 31 percent female -- a 38 percentage-point "jobs gap." Here is the jobs gap for the ten highest-paid positions.
PayScale's study is a necessary chaser to BLS and Census data, because the government "compares all weekly earnings, even though women and men do different things," said PayScale chief economist Katie Bardaro. "We're trying to compare men and women with the same education, same management responsibilities, similar employers, in companies with a similar number of employees." After controlling for these factors, "the gender wage gap disappears for most positions," she said.
In one job, they had enough data to show a statistically significant wage advantage for female workers. That is "dental hygienist."
But even if the gender gap disappears after controlling for experience and job selection, it's hard to imagine that men thoroughly dominating the highest-paying positions is a good outcome. For example, the expectation that women more than men bear the responsibility to raise children gently nudges thousands of highly educated women out of full-time work.
There is a wage difference. But it might not be the wage difference that you thought. The real gap isn't between men and women doing the same job. The real gap is between men and women doing different jobs and following different careers.
That gap should continue to tighten. Women have earned the majority of bachelor's degrees for the last few years. They're well-positioned to benefit from a growing professional service economy, and working moms are already the primary breadwinners in 40 percent of households with kids, an all-time high. But if women are more likely to go into health care than manufacturing, more likely to work in human resources than software, and more likely to leave their careers early to start a family, the gaps will persist.
Ideally, some day soon, it won't take a statistical "control" to show that men and women are fundamentally equal partners -- and equal competitors -- in the work force. It will just be the obvious truth.
Tax expenditures are funny. They're not taxes, exactly, because they save us money. They're not spending, exactly, because the dollars are never actually spent. They're somewhere in between. So think of them as tax spending.
Or just think of them as the ultimate nudge. The carrot hiding behind the tax code's big stick, tax spending guides us by making certain behaviors and actions cheaper. We encourage employers to provide health care by taxing wages but not health benefits. We encourage investing by taxing a dollar earned from dividends more lightly than a dollar earned from a salary.
And as the CBO reports in a new study today, Washington's tax spending budget -- comprising everything from mortgage deductions to the child tax credit to lower tax rates on capital gains -- is so massive, it's technically larger than Medicare, Defense, or Social Security. The tax spending budget is equal to 1/17th of the US economy.
Like the federal budget, the tax spending budget isn't all bad or all good. It's a collage of interests lurking in the shadow of the tax code, representing all factions: large corporations, small corporations, institutional investors, low-income families, and every slice of America you can name. Conservatives rail against big government, where "bigness" is synonymous with the size of the federal budget. But the tax-spending budget, whose total is rarely quoted, influences the economy nearly as powerfully, allowing government to promote relatively expensive housing and generous employer health care. It's the big prod.
Every year, Mary Meeker and the team from KPCB unleash upon the world the mother of all slideshows, which aims to sum up The State of the Internet. This year's behemoth was born this morning, weighing in at 117 pages. Here are the 12 most interesting pages. Check out the full report here.
(1) America's Media Attention in 1 Graph. Americans spend just six percent of their media diet with print, but those pages attract 23 percent of all ad spending. In mobile, the trend is the polar opposite. I don't know if this is worse news for the print industry (where you'd think ad spending has a long way to fall) or Facebook (since monetizing mobile attention is so devilishly difficult).
(2) Glam Media Is Huge! Bigger than Wikipedia or Apple. The only Internet properties with more US users are Google, Microsoft, Facebook, and Yahoo.
(3) This Is How Fast the Smartphone Leaderboard Changed. Apple iOS and Android were invisible in 2005, but as the smartphone market exploded, so did they.
(4) Today, the Internet Is Photos, But That's a Really, Really Recent Phenomenon. And Snapchat's growth is absolutely insane.
(5) Facebook Is the Only Major Social Media With Declining Use in 2012. Uh oh?
(6) Wow, Saudi Arabia Really Loves to Share. And Americans are weirdly private.
(9) Apple and Samsung Ate the World. Smartphones are arguably the central device in the digital economy, and Apple and Samsung have doubled their collective market share even as smartphone units quadrupled around the world.
(10) Smartphone users reach for their phones 150 times a day! About a third of those reaches are for messaging and calls. (Also, who needs to check their alarm 8 times a day? My lord.)
(11) The Era of Windows and Intel (WinTel) Was Astonishingly Dominant, and Now It's Over. It's the ApAnd era now when it comes to personal computing platforms.
Bangkok -- already home to the world's two most-photographed locations on Instagram -- is now the most visited city in the world by international tourists, according to the new Global Destination City Index, edging out London by less than one percent. But foreign visitors still spend more in New York than in any other city, by a wide margin.
A map of global tourism spending is really a map of people and money. Southeast Asian tourism has exploded with the growth of the region's upper-middle class. Chinese tourists have surged to 83 million per year, as Jake Maxwell Watts reports. Eleven of the 12 cities with the fastest increase in air travel connectivity are "east and south of Istanbul," reflecting the growth of wealth in Asia and its growing importance as a destination for businesses from the U.S. and Europe.
Global tourism spending trends don't just tell us where populations are growing, but also where the rich are getting richer. Since 2009, spending by international tourists has grown more than twice as fast as the world GDP, a clear indication that international tourism is a barometer of the world's wealthier families and businesses, not merely its median households.
The Tourism Capitals
There are actually three "top" tourist destinations, depending on what metric you consider the most important. The most-visited city is Bangkok, which draws its tourism from Chinese cities large and small, but also strongly from Singapore and Tokyo.
New York accommodates 4 million fewer tourists, but they spend considerably more on overpriced Broadway tickets and plastic Statuettes of Liberty. In particular, Brazilian tourists apparently go absolutely nuts in Manhattan, spending an astonishing $2200 per visitor.
But that's not such an astonishing number when you consider that it's the average spend by ALL tourists to Tokyo. The most expensive city in the world, as measured by average spending per tourist, is still the capital of Japan.
Texas is killing it.
It dominated the recession, crushed the recovery, and in a new analysis of jobs recovered since the downturn, its largest city stands apart as the most powerful job engine in the country -- by far.
The ten largest metros have recovered 98 percent of the jobs lost during the recession, on average. But Houston, the first major city to regain all the jobs lost in the downturn, has now added more than two jobs for every one it lost after the crash. That's incredible.
The Arab Oil Embargo in 1973 quadrupled oil prices in just three months, sparking a drilling boom that at one point accounted for half of all jobs in Houston's export sectors. But when oil prices collapsed in 1982, oil and mining jobs fell by 57 percent. "By the time Houston's economy hit bottom in January '87," Jankowski said in an email, "the region had 221,900 fewer jobs than it had five years earlier."
But the energy industry avoided a dramatic boom/bust cycle this time around. "The region lost one in 22 jobs this recession versus one in seven jobs during the recession of the '80s," he said. Why were layoffs so mild? The story I've typically heard and reported is that energy prices fell later and recovered earlier than the rest of the economy. But Jankowski has another surprising theory. Houston's energy sector is remarkably old -- the average age is over 50 -- and companies were nervous about laying off too many veteran workers before they had time to pass their skills down to the younger generation. Houston's energy demographics "helped to moderate energy industry job losses," leading to fewer job losses overall.
The 1980s also taught Houston a lesson about real estate. Between 1982 and 1987, Houston suffered "one of the worst regional recessions in U.S. history," Jankowski said. The metro area lost more than 220,000 jobs -- one in seven in the region -- but added nearly 188,000 housing units, as developers ignored the signs that demand had plummeted. The results were disastrous and scarring for the real estate industry.
The city avoided over-building problems in this recession, as lenders and builders tightened up in the early years of the crisis. Houston didn't really have a housing bubble in the 2000s: the ratio between its median house prices and median household incomes peaked at 2.7 in 2006. By comparison, a typical Miami family would have had to spend five-and-a-half years of its total income to afford an average home in the city by 2006; in Riverside, it would take nearly seven years. So as housing values cratered across the Sun Belt -- by 40 percent in Miami and 44 percent in Riverside -- they merely dipped about 2 percent in Houston.
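The "years of income to buy a home" comparison above is just a median-price-to-median-income ratio. A small sketch makes the arithmetic explicit; note that the dollar figures here are hypothetical, chosen only to reproduce the ratios quoted in the text (2.7 for Houston, about 5.5 for Miami, nearly 7 for Riverside):

```python
# Price-to-income ratios: years of median household income needed to buy
# a median-priced home. Dollar figures are illustrative assumptions that
# back out the ratios cited in the article for 2006.
metros = {
    "Houston":   {"median_price": 135_000, "median_income": 50_000},
    "Miami":     {"median_price": 247_500, "median_income": 45_000},
    "Riverside": {"median_price": 385_000, "median_income": 55_000},
}

for city, d in metros.items():
    ratio = d["median_price"] / d["median_income"]
    print(f"{city}: {ratio:.1f} years of income per home")
```

The same dollar price looks very different depending on local incomes, which is why the ratio, not the raw price, signals a bubble.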
Hitch a Ride
The Great Recession was worse than every post-war recession by just about any metric you pick. Except one: global trade continued to climb after the nadir, thanks to developing countries like China carrying what was left of world growth.
Houston was uniquely poised to capture the gains from a growing world, due to its proximity to Latin America and its strength in energy. Between 2008 and 2010, "more than 100 foreign-owned companies relocated, expanded or started new businesses in Houston," Jankowski wrote.
While moderation protected the city during the 2000s, an openness to overseas (or over-the-border) business boosted job creation at a time when domestic demand was lagging badly. Although human mistakes can muck up the blessings of topography and geography in Houston (and they have), the city's long memory helped it avoid the mistakes of over-building and over-firing that plagued other cities and states.