In between the happiness of Christmas and the promise of the New Year, permit me to introduce a sour note, a hint of a scold. If you're like, well, almost everybody, you're not saving enough. 15% of each paycheck into the 401(k) is the bare minimum you can get away with, not some aspirational level you can maybe hope to hit someday when you don't have all these problems.
I mean, obviously if one out of two workers in your household just lost their job, or has been stricken with some horrid cancer requiring all sorts of ancillary expenses, then it's okay to cut back on the retirement savings for a bit. But let's be honest: that doesn't describe most of us in those years when we don't save enough.
What describes most of those years when we aren't saving is normal life. We moved. We got married or had kids. The kids required entirely expected things like food, clothes, and schooling. Work was hard and we felt we wanted a really nice vacation. Friends and family went through the same normal life stages that we were, requesting that we travel and bring gifts to the happy events.
These things are not an excuse to stop saving, for all that I have used these excuses myself from time to time (and regretted it later, at length). The recession should have driven home some hard facts, but the nation's 3.5% personal savings rate indicates that these lessons haven't quite sunk in, so let me elaborate some of them.
1. You cannot count on high asset growth rates to bail out a low savings rate. In the 1990s, we believed that we could guarantee something like an 8% (average) annual return by pumping our money into the stock market and leaving it there. The problem is, this may no longer be true. For the last few decades, there have been a number of factors pushing up the price of stocks:
a. Low interest rates on bonds prompted investors to look for higher returns elsewhere
b. People started believing that over the long term, equities offered a low-risk opportunity for higher returns. Unfortunately in finance, many things are only true if no one believes they are true. If everyone thinks that equities are low risk, they will bid away the "equity premium"--which is to say, the discount that buyers expected for assuming greater risk. At which point, stocks no longer offer a low-risk excess return.
c. Baby boomers who had undersaved started pouring money into the stock market in an attempt to make up for their lack of savings.
However, stock prices cannot indefinitely grow faster than corporate profits; eventually, you run out of greater fools. And future corporate profits are going to be constrained by slower growth in the workforce as baby boomers retire, and by the taxes needed to pay for all the bailouts and stimulus we just did. Unless there's a sudden boom in productivity--entirely possible, but entirely impossible to predict, or count on--there's every reason to expect that stock-market returns will be lower, and more volatile, than we got used to.
We saw a similar cycle in houses. A mortgage used to be a form of forced saving that gave you an (almost) free place to live in retirement and a little bit of value when you sold the house. We didn't realize that a number of developments had been pushing up the price of homes:
a. The development of the 30-year self-amortizing mortgage, which enabled people to pay a much higher price for a given house than they would have in the era of 5-year balloon mortgages.
b. The baby boom, which increased demand for houses as they aged
c. The run-up in inflation in the 1970s, which gave (relatively inflation-proof) real estate a boost--and then the subsequent decline in inflation (and interest rates), which gave people the illusion of being able to afford more house because the up-front payments were lower.
d. More widely available credit, which let more people take on bigger loans
e. The increasing value of (and competition for) a small number of slots at selective colleges, which put a rising premium on houses in good school districts
These trends gave people the illusion that houses were, in some fundamental way, an "excellent investment". But they're risky in all sorts of ways: neighborhoods can get worse rather than better, local economies can stagnate, the style of your home can go out of fashion.
Moreover, like the stock market, houses are still pretty expensive by historical standards, as charts from Barry Ritholtz show.
If you can't count on a steep run-up in asset prices to build up your retirement savings, that leaves you with one alternative: save a much bigger chunk of your income.
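The arithmetic behind that alternative is worth making concrete. Here is a minimal sketch (the salary, return, and horizon figures are hypothetical, chosen only for illustration) comparing a low-savings plan that counts on high returns against a high-savings plan that assumes more modest ones:

```python
def nest_egg(salary, save_rate, annual_return, years):
    """Future value of saving a fixed fraction of salary at the end of
    each year, compounding at a constant annual return."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + salary * save_rate
    return balance

# 30 years on a hypothetical $60,000 salary:
optimist = nest_egg(60_000, 0.05, 0.08, 30)  # 5% saved, hoping for 8% returns
realist  = nest_egg(60_000, 0.15, 0.05, 30)  # 15% saved, assuming 5% returns

# Tripling the savings rate more than makes up for the lower return:
# optimist ends up around $340k, realist around $600k.
```

The point of the sketch is that the variable you control (the savings rate) can dominate the variable you don't (the market's return).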
2. People are still living longer in retirement. The increases in life expectancy post-retirement aren't as dramatic as they were in the antibiotic era, but they're still creeping up. That means that you have to take smaller sums out of the kitty each year, so that what you have left will be enough to live on.
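To see how a longer retirement shrinks the annual draw, a standard level-annuity formula is a rough but serviceable illustration (the $500,000 balance and 3% constant real return are simplifying assumptions, not a forecast):

```python
def annual_draw(balance, real_return, years):
    """Level annual withdrawal that exhausts `balance` after `years`,
    assuming a constant real return -- a deliberate simplification."""
    r = real_return
    return balance * r / (1 - (1 + r) ** -years)

# Same $500,000 nest egg, stretched over 20 vs. 30 years at 3% real:
twenty = annual_draw(500_000, 0.03, 20)  # roughly $33,600/yr
thirty = annual_draw(500_000, 0.03, 30)  # roughly $25,500/yr
```

Ten extra years of life expectancy cuts the sustainable annual withdrawal by about a quarter, which is exactly why the savings pile has to be bigger going in.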
3. Government finances are extremely strained. The Baby Boomers are about to dump an even heavier load on them. That means yes, higher taxes--but it also means that despite their formidable voting power, retirements financed mostly on the public dime are very likely to get leaner. Especially because birthrates are falling everywhere--which means that the supply of young, strong-backed immigrants to man the nursing homes will not be as ample as it is now.
4. Employers are not kind to older workers. I wish this weren't so, but I'm very much afraid it is. People who say "I won't be able to retire" may not be given a choice in the matter. Like most modern economies, we've cut a societal deal where you're underpaid in your twenties, and overpaid in your fifties and sixties . . . and as a result, it's very tempting to fire those overpaid oldsters when times get tough.
And once you're forced out in your fifties, it is very, very hard to find a new job of any sort, much less one that pays what you're used to. Even if you're willing to take a big pay cut to work a less prestigious job, employers are reluctant to hire the overqualified--particularly since 99 times out of 100 the overqualified 55-year-old simply does not have the stamina or the life flexibility of the single twenty-somethings who are applying for the same job. And physically, you may not be able to do many of the low-rent jobs that paid your way through college: by the time you're sixty, you're quite likely to have back, joint, or skeletal problems that make it hard to stand on your feet all day or lift heavy objects.
The upshot is that you can no longer plan on "making up" anemic retirement contributions later. You have to start making them--right now.
5. Emergencies seem to be lasting longer than they used to. Before the 1990s, unemployment used to spike sharply during recessions, then recover quickly along with demand. We had our first "jobless recovery" under Clinton, and now we've got two more under our belt. That means that the old advice of three to six months' worth of emergency funds is no longer enough. Eight months to a year is more realistic.
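Sizing that cushion is simple arithmetic; here is a minimal sketch (the $4,000/month expense figure and the ten-month midpoint are hypothetical):

```python
def emergency_fund(monthly_expenses, months=10):
    """Target cash cushion: eight to twelve months of expenses,
    with ten used here as a midpoint default."""
    return monthly_expenses * months

def months_to_build(target, monthly_savings):
    """How long it takes to fill the cushion at a given savings pace,
    ignoring interest for simplicity."""
    return target / monthly_savings

target = emergency_fund(4_000)        # $40,000 on $4,000/month expenses
horizon = months_to_build(target, 800)  # 50 months at $800/month saved
```

The second function is the sobering one: at a modest savings pace, a full cushion takes years to build, which is another argument for starting now.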
When I write these posts, I generally get two types of responses: people who smugly tell me that they are saving 30% or more of their income (way to go!) and people who tell me that it is simply not possible for them to save 15-20% of their income.
You know better than I, of course. But most of the research on consumer finance shows the same thing: people can usually save a lot more if they make saving a priority. Most people don't. Savings is an afterthought--it's the residual of whatever hasn't been spent on clothes, groceries, cars, dinners out, school trips, travel soccer team, college tuition, vacation, etc. Unsurprisingly, there's frequently no residual. However, if people decide how much to save, and then budget their consumption out of what is left, they suddenly realize that they could drive an uglier car, take the kids out of dance class, live with the kitchen the way it is, stay home for a week in August instead of going to Disneyworld, and so forth. And those people are not, as you might think prospectively, made desperately unhappy by these sacrifices. Savers are actually happier than the general population--in part, one assumes, because they're less worried.
Many people tell me they can't save because children are so expensive. Children are indeed very expensive. But they're getting more expensive every year, and that's because we're spending more money on them. We're spending more money on houses to get them into good school districts, on activities so that they have every chance to get into Harvard (or the NHL), on clothes and cell phones and video game consoles and the list is endless, plus then there's that tuition to Harvard or some sort of even-more-expensive smaller private college.
These expenses are optional, not mandatory. And before you tell me about how unhappy your child will be if you do not buy him all of these necessities, think about how unhappy he's going to be if you have to move in with him. Better yet, volunteer for some outreach to the bankrupt seniors whose kids wouldn't let them move in, and see how their lives are going.
This is not to criticize. Saving is hard, which is why, just like you, we're trying to figure out how to hit even more ambitious savings goals in the New Year. And consumption is fun. That's why most people struggle to save very much.
But a lot of people are going along on autopilot; they're saving 5% because it seemed safe when they were 25 and so what if they're now 37? They look at the neighbors spending a fortune on cars and school activities and figure that if it's safe for the neighbors, it must be safe for them too. But this is the opposite of the truth. If your neighbors aren't saving much (and trust me, they aren't), that means a less productive economy in the future--and more people trying to claim a very limited supply of public funds. You don't want to be among them.
It helps to remember that the object is not to turn yourself into a miser; it's to make your spending patterns sustainable. Your splurges will actually be a lot more fun if you know that they aren't putting you at risk of bankruptcy, foreclosure or a retirement in poverty.
If you're not saving enough--and you know who you are--don't decide today that you're going to save 15%, and then forget about it tomorrow when you realize how daunting a task that will be. Instead, try this: divert an extra 5% of your income into a 401(k), IRA, or other tax-advantaged savings plan. If your 401(k) is stuffed but you don't have much of an emergency fund--or if, for some reason, you don't qualify for tax-advantaged savings--have 7% of every paycheck diverted to a bank account which isn't linked to your other accounts. It's a slow week at work, the perfect time to fuss with HR paperwork.
The important thing is to pay yourself first. Savings should be the first thing you do, not the last. After you've saved, then you budget your consumption. I won't tell you what to cut, because when you confront your new, slightly leaner budget, you'll be perfectly able to calculate what's no longer worth the money to you. I think you'll be pleasantly surprised to find that after a few weeks or a few months of initial pinch, you won't miss the money much.
If at the end of the year, you still aren't saving enough, then you can do the same thing again--pull another 5-7% out of every paycheck. Within a few years, you'll be at a healthy level of savings, without excessive fiscal pain.
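The ratchet strategy above can be sketched in a few lines (the starting rate, annual bump, and target are the post's own figures; the five-year horizon is illustrative):

```python
def ratchet(start_rate, bump, target, years):
    """Raise the savings rate by `bump` each year until it hits `target`,
    then hold it there. Returns the year-by-year path of rates."""
    rate = start_rate
    path = []
    for _ in range(years):
        path.append(round(rate, 2))
        rate = min(rate + bump, target)
    return path

# Start at 5%, add 5 points a year, cap at 20%:
ratchet(0.05, 0.05, 0.20, 5)  # -> [0.05, 0.1, 0.15, 0.2, 0.2]
```

Because each step is small, no single year's budget takes the full hit, which is the whole appeal of the approach.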
But the most important thing is this: don't start looking for reasons you can't. If you hunt hard enough, you'll find them. Unfortunately, those reasons aren't going to do a damn thing to make your house payment if you get laid off, or keep you in prescription drugs when you retire.
Trump’s supporters backed a time-honored American political tradition, disavowing racism while promising to enact a broad agenda of discrimination.
THIRTY YEARS AGO, nearly half of Louisiana voted for a Klansman, and the media struggled to explain why.
It was 1990 and David Duke, the former grand wizard of the Ku Klux Klan, astonished political observers when he came within striking distance of defeating incumbent Democratic U.S. Senator J. Bennett Johnston, earning 43 percent of the vote. If Johnston’s Republican rival hadn’t dropped out of the race and endorsed him at the last minute, the outcome might have been different.
Was it economic anxiety? The Washington Post reported that the state had “a large working class that has suffered through a long recession.” Was it a blow against the state’s hated political establishment? An editorial from United Press International explained, “Louisianans showed the nation by voting for Duke that they were mad as hell and not going to take it any more.” Was it anti-Washington rage? A Loyola University pollster argued, “There were the voters who liked Duke, those who hated J. Bennett Johnston, and those who just wanted to send a message to Washington.”
If the party wants to win back votes in the Trump era, it will need to stop ignoring people of faith.
Democrats ignored broad swaths of religious America in the 2016 election campaign and the nation has suffered because of it. Yet calls for a recommitment to faith outreach—particularly to white and other conservative or moderate religious voters—have been met in some corners of liberal punditry with a response as common as it is unwarranted. Some quarters of the Democratic party would rather maintain rhetorical and ideological purity than win with a more inclusive coalition. For the sake of the country, the party must turn back to people of faith.
We know faith outreach works, because it has worked before. In 2005, after the reelection of a president many Democrats believed was clearly unfit for leadership, a concerted decision was made to close the “God Gap” that the GOP had so effectively exploited. Yes, the Democratic Party was losing among white religious people, but there was also an understanding in the party that its margins among black and Hispanic voters were limited by the perception that the party was antagonistic toward religion. Democrats took back Congress in the 2006 midterms, through a combination of direct engagement, district-based flexibility on policy, and rhetorical adjustments. The majority gained in 2006 is the majority that delivered all of the legislative-policy wins progressives hail from the first two years of the Obama administration, including the Affordable Care Act.
How leaders lose mental capacities—most notably for reading other people—that were essential to their rise
If power were a prescription drug, it would come with a long list of known side effects. It can intoxicate. It can corrupt. It can even make Henry Kissinger believe that he’s sexually magnetic. But can it cause brain damage?
When various lawmakers lit into John Stumpf at a congressional hearing last fall, each seemed to find a fresh way to flay the now-former CEO of Wells Fargo for failing to stop some 5,000 employees from setting up phony accounts for customers. But it was Stumpf’s performance that stood out. Here was a man who had risen to the top of the world’s most valuable bank, yet he seemed utterly unable to read a room. Although he apologized, he didn’t appear chastened or remorseful. Nor did he seem defiant or smug or even insincere. He looked disoriented, like a jet-lagged space traveler just arrived from Planet Stumpf, where deference to him is a natural law and 5,000 a commendably small number. Even the most direct barbs—“You have got to be kidding me” (Sean Duffy of Wisconsin); “I can’t believe some of what I’m hearing here” (Gregory Meeks of New York)—failed to shake him awake.
The second reason is subtler, but perhaps equally significant. To pay for a permanent tax cut on corporations, the plan raises taxes on colleges and college students, which is part of a broader Republican war on higher education in the U.S. This is a big deal, because in the last half-century, the most important long-term driver of wage growth has arguably been college.
The mass murderer, who died on Sunday at 83, turned one following into another.
“All of us are excited by what we most deplore,” Martin Amis wrote in the London Review of Books in 1980, reviewing Joan Didion’s The White Album. In the title piece in that collection, Didion’s second, the essayist recalls sitting in her sister-in-law’s swimming pool in Beverly Hills on August 9, 1969, when the phone rang. The friend on the line had heard that across town there had been a spate of murders at a house rented by the director Roman Polanski, on Cielo Drive. Early reports were frenzied, shocking, lurid, and incorrect. “I remember all of the day’s misinformation very clearly,” Didion writes, “and I also remember this, and wish I did not: I remember that no one was surprised.”
The killings orchestrated that summer by Charles Manson, who died on Sunday at the age of 83, after spending the past 48 years in prison, occupy a unique space in the American cultural psyche. All of the elements of the Tate–LaBianca murders, as they came to be known, seemed designed for maximum tabloid impact. There was the actor Sharon Tate, luminously beautiful and eight months pregnant, who was stabbed to death with four others at a rental home in Hollywood. There were the killers—young women, Manson acolytes corrupted by a sinister cult figure. There were the drugs, abundant both on the Manson Family ranch and at the house on Cielo Drive. There was the nebulous chatter about satanism and witchcraft and race wars ready to erupt. And, as Didion captured, there was a sense that something was rotten from the Hollywood Hills to Haight-Ashbury—that the Summer of Love had long since curdled into paranoia and depravity.
Astronomers describe what it's like to chart the first confirmed object from outside our home in the cosmos.
Nobody saw it coming.
The rocky object showed up in telescope images the night of October 19. The Pan-STARRS1 telescope, from its perch atop a Hawaiian volcano, photographed it during its nightly search for near-Earth objects, like comets and asteroids. Rob Weryk, a postdoctoral researcher at the University of Hawaii Institute for Astronomy, was the first to lay eyes on it, as he sorted through the telescope’s latest haul. The object was moving “rapidly” across the night sky. Weryk thought it was probably a typical asteroid, drifting along in the sun’s orbit.
“It was only when I went back and found it [in the data from] the night before that it became obvious it was something else,” he said. “I’d never expected to find something like this.”
How did Andrew Anglin go from being an antiracist vegan to the alt-right’s most vicious troll and propagandist—and how might he be stopped?
On December 16, 2016, Tanya Gersh answered her phone and heard gunshots. Startled, she hung up. Gersh, a real-estate agent who lives in Whitefish, Montana, assumed it was a prank call. But the phone rang again. More gunshots. Again, she hung up. Another call. This time, she heard a man’s voice: “This is how we can keep the Holocaust alive,” he said. “We can bury you without touching you.”
When Gersh put down the phone, her hands were shaking. She was one of only about 100 Jews in Whitefish and the surrounding Flathead Valley, and she knew there were white nationalists and “sovereign citizens” in the area. But Gersh had lived in Whitefish for more than 20 years, since just after college, and had always considered the scenic ski town an idyllic place. She didn’t even have a key to her house—she’d never felt the need to lock her door. Now that sense of security was about to be shattered.
There’s a manifest need to lower corporate tax rates—but instead of building consensus, the GOP is pursuing a bill that’s likely to be rolled back even if it passes.
America badly needs corporate tax reform.
The United States pretends to tax corporations heavily. But those heavy tax rates are perforated by randomly generous rules such that many tax-efficient firms pay nothing at all, or even receive money back from the U.S. Treasury. The result is heavy unfairness between industries and firms, an unfairness that many economists believe systematically distorts investment decisions. U.S. productivity growth has been sluggish since the Great Recession—and had actually turned negative by the beginning of 2016.
At the same time, the corporate share of the federal-tax burden has dwindled over the years and decades. More and more of the cost of government now falls upon the payroll tax, which weighs most heavily on low- and middle-income wage earners. These Americans are suffering stagnating incomes, very probably because of the poor productivity growth of the past half-decade.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
The Facebook founder has discussed "community" more than 150 times in public. A close reading reveals his road map for the platform’s future.
There’s a story that Mark Zuckerberg has told dozens of times over the years. Shortly after he’d launched Facebook in February 2004, he went to get pizza with Kang-Xing Jin, a coder friend who would become a Facebook executive, at a place around the corner from his dorm.
In one telling, Zuckerberg says he was thinking, “this is great that we have this community that now people can connect within our little school, but clearly one day, someone is going to build this for the world.”
But there was no reason to expect that this kid and his group of friends would be the people who would build this for the world. “It hadn’t even crossed my mind,” he said in 2013. They were technically gifted, but as Zuckerberg tells it, they had basically no resources or experience at a time when there were already massive technology companies trying to create social networks from MySpace to Microsoft, Google to Yahoo.