In between the happiness of Christmas and the promise of the New Year, permit me to introduce a sour note, a hint of a scold. If you're like, well, almost everybody, you're not saving enough. 15% of each paycheck into the 401(k) is the bare minimum you can get away with, not some aspirational level you can maybe hope to hit someday when you don't have all these problems.
I mean, obviously if one out of two workers in your household just lost their job, or has been stricken with some horrid cancer requiring all sorts of ancillary expenses, then it's okay to cut back on the retirement savings for a bit. But let's be honest: that doesn't describe most of us in those years when we don't save enough.
What describes most of those years when we aren't saving is normal life. We moved. We got married or had kids. The kids required entirely expected things like food, clothes, and schooling. Work was hard and we felt we wanted a really nice vacation. Friends and family went through the same normal life stages that we were, requesting that we travel and bring gifts to the happy events.
These things are not an excuse to stop saving, for all that I have used these excuses myself from time to time (and regretted it later, at length). The recession should have driven home some hard facts, but the nation's 3.5% personal savings rate indicates that these lessons haven't quite sunk in, so let me elaborate on some of them.
1. You cannot count on high asset growth rates to bail out a low savings rate. In the 1990s, we believed that we could guarantee something like an 8% (average) annual return by pumping our money into the stock market and leaving it there. The problem is, this may no longer be true. For the last few decades, there have been a number of factors pushing up the price of stocks:
a. Low interest rates on bonds prompted investors to look for higher returns elsewhere
b. People started believing that over the long term, equities offered a low-risk opportunity for higher returns. Unfortunately in finance, many things are only true if no one believes they are true. If everyone thinks that equities are low risk, they will bid away the "equity premium"--which is to say, the discount that buyers expected for assuming greater risk. At which point, stocks no longer offer a low-risk excess return.
c. Baby boomers who had undersaved started pouring money into the stock market in an attempt to make up for their lack of savings.
However, stock prices cannot indefinitely grow faster than corporate profits; eventually, you run out of greater fools. And future corporate profits are going to be constrained by slower growth in the workforce as baby boomers retire, and by the taxes needed to pay for all the bailouts and stimulus we just did. Unless there's a sudden boom in productivity--entirely possible, but entirely impossible to predict, or count on--there's every reason to expect that stock-market returns will be lower, and more volatile, than what we got used to.
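The point about not counting on high returns can be made with simple arithmetic. Here is a minimal sketch (all figures hypothetical: a $60,000 income, a 15% savings rate, 30 years) of how much the assumed return changes the final nest egg, using the standard future-value-of-an-annuity formula:

```python
# Illustrative only: how much the assumed return matters to a saver.
# Saving s per year for n years at return r grows to s * ((1+r)^n - 1) / r.

def fv_of_annual_savings(s, r, n):
    """Future value of saving s per year for n years at annual return r."""
    return s * ((1 + r) ** n - 1) / r

salary = 60_000          # hypothetical income
savings = 0.15 * salary  # saving 15% of it each year
years = 30

for r in (0.08, 0.05, 0.03):
    nest_egg = fv_of_annual_savings(savings, r, years)
    print(f"{r:.0%} return -> ${nest_egg:,.0f} after {years} years")
```

Dropping the assumed return from the 8% of 1990s folklore to 5% cuts the final balance by roughly 40%; the only lever fully under your control is the contribution itself.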
We saw a similar cycle in houses. A mortgage used to be a form of forced saving that gave you an (almost) free place to live in retirement and a little bit of value when you sold the house. We didn't realize that a number of developments had been pushing up the price of homes:
a. The development of the 30-year self-amortizing mortgage, which enabled people to pay a much higher price for a given house than they would have in the era of 5-year balloon mortgages.
b. The baby boom, which increased demand for houses as they aged
c. The run-up in inflation in the 1970s, which gave (relatively inflation-proof) real estate a boost--and then the subsequent decline in inflation (and interest rates), which gave people the illusion of being able to afford more house because the up-front payments were lower.
d. More widely available credit, which let more people take on bigger loans
e. The increasing value of (and competition for) a small number of slots at selective colleges, which put a rising premium on houses in good school districts
These trends gave people the illusion that houses were, in some fundamental way, an "excellent investment". But they're risky in all sorts of ways: neighborhoods can get worse rather than better, local economies can stagnate, the style of your home can go out of fashion.
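Point (a) above, about amortization schedules, is worth seeing in numbers. This is a rough sketch with hypothetical figures (a $1,000 monthly housing budget at a 6% fixed rate), and it simplifies by treating the old-style 5-year loan as fully amortizing, whereas many balloon loans were interest-only with the principal due at the end:

```python
# Illustrative sketch: how loan structure changes the price a buyer can
# pay for the same monthly budget. All numbers are hypothetical.

def affordable_principal(monthly_payment, annual_rate, years):
    """Largest loan a fixed monthly payment can amortize at the given rate."""
    i = annual_rate / 12           # monthly interest rate
    n = years * 12                 # number of payments
    return monthly_payment * (1 - (1 + i) ** -n) / i

budget = 1_000   # hypothetical monthly housing budget
rate = 0.06      # hypothetical fixed annual rate

short = affordable_principal(budget, rate, 5)    # pay it off in 5 years
long_ = affordable_principal(budget, rate, 30)   # pay it off in 30 years
print(f"5-year amortization supports a loan of  ${short:,.0f}")
print(f"30-year amortization supports a loan of ${long_:,.0f}")
```

Stretching the same payment over 30 years supports a loan more than three times as large, which is exactly the mechanism that let buyers bid up house prices.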
Moreover, like the stock market, houses are still pretty expensive by historical standards, as this chart from Barry Ritholtz shows:
If you can't count on a steep run-up in asset prices to build up your retirement savings, that leaves you with one alternative: save a much bigger chunk of your income.
2. People are still living longer in retirement. The increases in life expectancy post-retirement aren't as dramatic as they were in the antibiotic era, but they're still creeping up. That means that you have to take smaller sums out of the kitty each year, so that what you have left will be enough to live on.
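How much those smaller withdrawals matter can be sketched with a closed-form formula: drawing W per year from a nest egg P earning a steady real return r exhausts the money after ln(W / (W - P·r)) / ln(1 + r) years, provided W exceeds the portfolio's income. A minimal illustration (a hypothetical $500,000 nest egg and a 2% real return):

```python
import math

# Rough sketch: how long a nest egg lasts at a given annual withdrawal,
# assuming a steady real return. All figures are hypothetical.

def years_money_lasts(nest_egg, annual_withdrawal, real_return):
    """Years until the balance hits zero; inf if growth outpaces withdrawals."""
    if annual_withdrawal <= nest_egg * real_return:
        return math.inf  # portfolio income covers the withdrawal forever
    ratio = annual_withdrawal / (annual_withdrawal - nest_egg * real_return)
    return math.log(ratio) / math.log(1 + real_return)

egg = 500_000
for rate in (0.04, 0.05, 0.06):
    w = egg * rate
    print(f"{rate:.0%} withdrawal: lasts about "
          f"{years_money_lasts(egg, w, 0.02):.0f} years")
```

Trimming the withdrawal rate from 6% to 4% stretches the same pot from roughly 20 years to 35, which is the arithmetic behind planning for a longer retirement.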
3. Government finances are extremely strained. The Baby Boomers are about to dump an even heavier load on them. That means yes, higher taxes--but it also means that despite their formidable voting power, retirements financed mostly on the public dime are very likely to get leaner. Especially because birthrates are falling everywhere--which means that the supply of young, strong-backed immigrants to man the nursing homes will not be as ample as it is now.
4. Employers are not kind to older workers. I wish this weren't so, but I'm very much afraid it is. People who say "I won't be able to retire" may not be given a choice in the matter. Like most modern economies, we've cut a societal deal where you're underpaid in your twenties, and overpaid in your fifties and sixties . . . and as a result, it's very tempting to fire those overpaid oldsters when times get tough.
And once you're forced out in your fifties, it is very, very hard to find a new job of any sort, much less one that pays what you're used to. Even if you're willing to take a big pay cut to work a less prestigious job, employers are reluctant to hire the overqualified--particularly since 99 times out of 100 the overqualified 55-year-old simply does not have the stamina or the life flexibility of the single twenty-somethings who are applying for the same job. And physically, you may not be able to do many of the low-rent jobs that paid your way through college: by the time you're sixty, you're quite likely to have back, joint, or skeletal problems that make it hard to stand on your feet all day or lift heavy objects.
The upshot is that you can no longer plan on "making up" anemic retirement contributions later. You have to start making them--right now.
5. Emergencies seem to be lasting longer than they used to. Before the 1990s, unemployment used to spike sharply during recessions, then fall quickly as demand recovered. We had our first "jobless recovery" under Clinton, and now we've got two more under our belt. That means that the old advice of three to six months' worth of emergency funds is no longer enough. Eight months to a year is more realistic.
When I write these posts, I generally get two types of responses: people who smugly tell me that they are saving 30% or more of their income (way to go!) and people who tell me that it is simply not possible for them to save 15-20% of their income.
You know better than I, of course. But most of the research on consumer finance shows the same thing: people can usually save a lot more if they make saving a priority. Most people don't. Savings is an afterthought--it's the residual of whatever hasn't been spent on clothes, groceries, cars, dinners out, school trips, travel soccer team, college tuition, vacation, etc. Unsurprisingly, there's frequently no residual. However, if people decide how much to save, and then budget their consumption out of what is left, they suddenly realize that they could drive an uglier car, take the kids out of dance class, live with the kitchen the way it is, stay home for a week in August instead of going to Disneyworld, and so forth. And those people are not, as you might think prospectively, made desperately unhappy by these sacrifices. Savers are actually happier than the general population--in part, one assumes, because they're less worried.
Many people tell me they can't save because children are so expensive. Children are indeed very expensive. But they're getting more expensive every year, and that's because we're spending more money on them. We're spending more money on houses to get them into good school districts, on activities so that they have every chance to get into Harvard (or the NHL), on clothes and cell phones and video game consoles and the list is endless, plus then there's that tuition to Harvard or some sort of even-more-expensive smaller private college.
These expenses are optional, not mandatory. And before you tell me about how unhappy your child will be if you do not buy him all of these necessities, think about how unhappy he's going to be if you have to move in with him. Better yet, volunteer for some outreach to the bankrupt seniors whose kids wouldn't let them move in, and see how their lives are going.
This is not to criticize. Saving is hard, which is why, just like you, we're trying to figure out how to hit even more ambitious savings goals in the New Year. And consumption is fun. That's why most people struggle to save very much.
But a lot of people are going along on autopilot; they're saving 5% because it seemed safe when they were 25 and so what if they're now 37? They look at the neighbors spending a fortune on cars and school activities and figure that if it's safe for the neighbors, it must be safe for them too. But this is the opposite of the truth. If your neighbors aren't saving much (and trust me, they aren't), that means a less productive economy in the future--and more people trying to claim a very limited supply of public funds. You don't want to be among them.
It helps to remember that the object is not to turn yourself into a miser; it's to make your spending patterns sustainable. Your splurges will actually be a lot more fun if you know that they aren't putting you at risk of bankruptcy, foreclosure or a retirement in poverty.
If you're not saving enough--and you know who you are--don't decide today that you're going to save 15%, and then forget about it tomorrow when you realize how daunting a task that will be. Instead, try this: divert an extra 5% of your income into a 401(k), IRA, or other tax-advantaged savings plan. If your 401(k) is stuffed but you don't have much of an emergency fund--or if, for some reason, you don't qualify for tax-advantaged savings--have 7% of every paycheck diverted to a bank account which isn't linked to your other accounts. It's a slow week at work, the perfect time to fuss with HR paperwork.
The important thing is to pay yourself first. Savings should be the first thing you do, not the last. After you've saved, then you budget your consumption. I won't tell you what to cut, because when you confront your new, slightly leaner budget, you'll be perfectly able to calculate what's no longer worth the money to you. I think you'll be pleasantly surprised to find that after a few weeks or a few months of initial pinch, you won't remember that you miss the money much.
If at the end of the year, you still aren't saving enough, then you can do the same thing again--pull another 5-7% out of every paycheck. Within a few years, you'll be at a healthy level of savings, without excessive fiscal pain.
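The step-up plan described above compounds faster than you might expect. Here is a minimal sketch (hypothetical figures: a $60,000 income, a starting 5% rate stepped up 5 points a year to a 20% cap, a 5% return):

```python
# Sketch of the step-up plan: raise the savings rate by `step` each year
# until it hits `cap`, and track the balance. All numbers hypothetical.

def ramped_savings(income, start_rate, step, cap, years, annual_return):
    """Balance after `years`, ratcheting the savings rate up annually."""
    balance, rate = 0.0, start_rate
    for _ in range(years):
        balance = balance * (1 + annual_return) + income * rate
        rate = min(rate + step, cap)  # step up until the cap is reached
    return balance

for yrs in (5, 10, 20):
    b = ramped_savings(60_000, 0.05, 0.05, 0.20, yrs, 0.05)
    print(f"after {yrs:2d} years: ${b:,.0f}")
```

The point of the gradual ramp is behavioral, not mathematical: each individual step is small enough not to hurt, but within three years you're contributing at the full healthy rate.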
But the most important thing is this: don't start looking for reasons you can't. If you hunt hard enough, you'll find them. Unfortunately, those reasons aren't going to do a damn thing to make your house payment if you get laid off, or keep you in prescription drugs when you retire.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
The new version of Apple’s signature media software is a mess. What are people with large MP3 libraries to do?
When the developer Erik Kemp designed the first metadata system for MP3s in 1996, he provided only three options for attaching text to the music. Every audio file could be labeled with only an artist, song name, and album title.
Kemp’s system has since been augmented and improved upon, but never replaced. Which makes sense: Like the web itself, his schema was shipped, good enough, and an improvement on the vacuum that preceded it. Those three big tags, as they’re called, work well with pop and rock written between 1960 and 1995. This didn’t prevent rampant mislabeling in the early days of the web, though, as anyone who remembers Napster can tell you. His system stumbles even harder when it needs to capture hip-hop’s tradition of guest MCs or jazz’s vibrant culture of studio musicianship.
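Kemp's scheme survives today as the ID3v1 tag: a fixed 128-byte block appended to the end of the MP3 file. A minimal sketch of reading the three big tags follows; the field widths match the commonly documented ID3v1 layout (which also squeezed in a year, a comment, and a one-byte genre code alongside title, artist, and album):

```python
# Minimal sketch of reading the 1996-era ID3v1 block: the last 128 bytes
# of the file, starting with "TAG", then three fixed 30-byte text fields.
import struct

def read_id3v1(path):
    """Return the three big tags from an MP3's trailing ID3v1 block, or None."""
    with open(path, "rb") as f:
        f.seek(-128, 2)              # the tag lives in the last 128 bytes
        block = f.read(128)
    if block[:3] != b"TAG":
        return None                  # no ID3v1 tag present
    title, artist, album = struct.unpack("3x30s30s30s", block[:93])
    decode = lambda b: b.split(b"\x00")[0].decode("latin-1").strip()
    return {"title": decode(title), "artist": decode(artist), "album": decode(album)}
```

Those hard 30-byte limits are precisely why long titles, guest-MC credits, and studio personnel never fit: anything past 30 characters was simply truncated.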
A controversial treatment shows promise, especially for victims of trauma.
It’s straight out of a cartoon about hypnosis: A black-cloaked charlatan swings a pendulum in front of a patient, who dutifully watches and ping-pongs his eyes in turn. (This might be chased with the intonation, “You are getting sleeeeeepy...”)
Unlike most stereotypical images of mind alteration—“Psychiatric help, 5 cents” anyone?—this one is real. An obscure type of therapy known as EMDR, or Eye Movement Desensitization and Reprocessing, is gaining ground as a potential treatment for people who have experienced severe forms of trauma.
Here’s the idea: The person is told to focus on the troubling image or negative thought while simultaneously moving his or her eyes back and forth. To prompt this, the therapist might move his fingers from side to side, or he might tap or wave a wand. The patient is told to let her mind go blank and notice whatever sensations might come to mind. These steps are repeated throughout the session.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subject matters on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a proctor watches her from afar by monitoring her computer and watching her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
A leading neuroscientist who has spent decades studying creativity shares her research on where genius comes from, whether it is dependent on high IQ—and why it is so often accompanied by mental illness.
As a psychiatrist and neuroscientist who studies creativity, I’ve had the pleasure of working with many gifted and high-profile subjects over the years, but Kurt Vonnegut—dear, funny, eccentric, lovable, tormented Kurt Vonnegut—will always be one of my favorites. Kurt was a faculty member at the Iowa Writers’ Workshop in the 1960s, and participated in the first big study I did as a member of the university’s psychiatry department. I was examining the anecdotal link between creativity and mental illness, and Kurt was an excellent case study.
He was intermittently depressed, but that was only the beginning. His mother had suffered from depression and committed suicide on Mother’s Day, when Kurt was 21 and home on military leave during World War II. His son, Mark, was originally diagnosed with schizophrenia but may actually have bipolar disorder. (Mark, who is a practicing physician, recounts his experiences in two books, The Eden Express and Just Like Someone Without Mental Illness Only More So, in which he reveals that many family members struggled with psychiatric problems. “My mother, my cousins, and my sisters weren’t doing so great,” he writes. “We had eating disorders, co-dependency, outstanding warrants, drug and alcohol problems, dating and employment problems, and other ‘issues.’ ”)
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first-to-vote primary state. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
The authors in the running for Britain's most prestigious literary award come from seven countries and include seven women writers.
The longlist for the Man Booker Prize, one of the most prestigious literary awards, was announced Wednesday. For the second year, the prize was open to writers of any nationality who publish books in English in the U.K., and this year five American writers made the list of 13 contenders, chosen by five judges from a pool of 156 total works.
The U.S. is, in fact, the most well-represented country, with other entrants hailing from Great Britain, Jamaica, New Zealand, Nigeria, Ireland, and India. There are three debut novelists and one former winner on the list, and women writers outnumber men seven to six. From dystopian and political novels to a multitude of iterations on the family drama, the selections capture the ever-changing human experience in very different ways.
An alpenhorn performance in Switzerland, a portrait of Vladimir Putin made of spent ammunition from Ukraine, fireworks in North Korea, Prince Charles surprised by an eagle, wildfire in California, protests in the Philippines and Turkey, a sunset in Crimea, and much more.
Members of Colombia's younger generation say they “will not torture for tradition.”
MEDELLÍN, Colombia—On a scorching Saturday in February, hundreds of young men and women in Medellín stripped down to their swimsuit bottoms, slathered themselves in black and red paint, and sprawled out on the hot cement in Los Deseos Park in the north of the city. From my vantage point on the roof of a nearby building, the crowd of seminude protesters formed the shape of a bleeding bull—a vivid statement against the centuries-old culture of bullfighting in Colombia.
It wasn’t long ago that Colombia was among the world’s most important countries for bullfighting, due to the quality of its bulls and its large number of matadors. In his 1989 book Colombia: Tierra de Toros (“Colombia: Land of Bulls”), Alberto Lopera chronicled the maturation of the sport that Spanish conquistadors had introduced to South America in the 16th century, from its days as an unorganized brouhaha of bulls and booze in colonial plazas to a more traditional Spanish-style spectacle whose fans filled bullfighting rings across the country.