In between the happiness of Christmas and the promise of the New Year, permit me to introduce a sour note, a hint of a scold. If you're like, well, almost everybody, you're not saving enough. 15% of each paycheck into the 401(k) is the bare minimum you can get away with, not some aspirational level you can maybe hope to hit someday when you don't have all these problems.
I mean, obviously if one out of two workers in your household just lost their job, or has been stricken with some horrid cancer requiring all sorts of ancillary expenses, then it's okay to cut back on the retirement savings for a bit. But let's be honest: that doesn't describe most of us in those years when we don't save enough.
What describes most of those years when we aren't saving is normal life. We moved. We got married or had kids. The kids required entirely expected things like food, clothes, and schooling. Work was hard and we felt we wanted a really nice vacation. Friends and family went through the same normal life stages that we were, requesting that we travel and bring gifts to the happy events.
These things are not an excuse to stop saving, for all that I have used these excuses myself from time to time (and regretted it later, at length). The recession should have driven home some hard facts, but the nation's 3.5% personal savings rate indicates that these lessons haven't quite sunk in, so let me spell a few of them out.
1. You cannot count on high asset growth rates to bail out a low savings rate. In the 1990s, we believed that we could guarantee something like an 8% (average) annual return by pumping our money into the stock market and leaving it there. The problem is, this may no longer be true. For the last few decades, there have been a number of factors pushing up the price of stocks:
a. Low interest rates on bonds prompted investors to look for higher returns elsewhere
b. People started believing that over the long term, equities offered a low-risk opportunity for higher returns. Unfortunately in finance, many things are only true if no one believes they are true. If everyone thinks that equities are low risk, they will bid away the "equity premium"--which is to say, the discount that buyers expected for assuming greater risk. At which point, stocks no longer offer a low-risk excess return.
c. Baby boomers who had undersaved started pouring money into the stock market in an attempt to make up for their lack of savings.
However, stock prices cannot indefinitely grow faster than corporate profits; eventually, you run out of greater fools. And future corporate profits are going to be constrained by slower growth in the workforce as baby boomers retire, and by the taxes needed to pay for all the bailouts and stimulus we just did. Unless there's a sudden boom in productivity--entirely possible, but entirely impossible to predict, or count on--there's every reason to expect that stock-market returns will be lower, and more volatile, than the ones we got used to.
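To see why the return assumption matters so much, here's a minimal sketch (with purely illustrative numbers, not a forecast) of what a steady $10,000-a-year savings habit compounds to under the old 8% assumption versus a more cautious 5%:

```python
def future_value(annual_contribution, rate, years):
    """Future value of end-of-year contributions compounding at a constant rate."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + rate) + annual_contribution
    return total

# Illustrative numbers only: $10,000 saved per year for 30 years.
optimistic = future_value(10_000, 0.08, 30)  # the 1990s-style 8% assumption
cautious = future_value(10_000, 0.05, 30)    # a more modest 5% assumption

print(f"At 8%: ${optimistic:,.0f}")  # roughly $1.13 million
print(f"At 5%: ${cautious:,.0f}")    # roughly $664,000
```

Three percentage points of return compound into a gap of nearly half a million dollars. If the market won't close that gap for you, the contribution line has to.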
We saw a similar cycle in houses. A mortgage used to be a form of forced saving that gave you an (almost) free place to live in retirement and a little bit of value when you sold the house. We didn't realize that a number of developments had been pushing up the price of homes:
a. The development of the 30-year self-amortizing mortgage, which enabled people to pay a much higher price for a given house than they would have in the era of 5-year balloon mortgages.
b. The baby boom, which increased demand for houses as they aged
c. The run-up in inflation in the 1970s, which gave (relatively inflation-proof) real estate a boost--and then the subsequent decline in inflation (and interest rates), which gave people the illusion of being able to afford more house because the up-front payments were lower.
d. More widely available credit, which let more people take on bigger loans
e. The increasing value of (and competition for) a small number of slots at selective colleges, which put a rising premium on houses in good school districts
These trends gave people the illusion that houses were, in some fundamental way, an "excellent investment". But they're risky in all sorts of ways: neighborhoods can get worse rather than better, local economies can stagnate, the style of your home can go out of fashion.
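The affordability effect of the 30-year mortgage (point a above) is simple annuity arithmetic. This sketch, with illustrative numbers, shows how much more principal the same monthly payment supports when it is amortized over 30 years instead of 5:

```python
def affordable_principal(monthly_payment, annual_rate, years):
    """Largest principal a fixed monthly payment can fully amortize
    over a given term (standard annuity present-value formula)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of payments
    return monthly_payment * (1 - (1 + r) ** -n) / r

# Illustrative: a $1,500/month housing budget at a 6% rate.
loan_30yr = affordable_principal(1_500, 0.06, 30)  # roughly $250,000
loan_5yr = affordable_principal(1_500, 0.06, 5)    # roughly $78,000
```

This simplifies history a bit--the old 5-year balloon loans weren't fully amortized, and the principal came due in a lump--but the comparison captures why stretching the term let buyers bid far more for the same house.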
Moreover, like the stock market, houses are still pretty expensive by historical standards, as a chart from Barry Ritholtz shows.
If you can't count on a steep run-up in asset prices to build up your retirement savings, that leaves you with one alternative: save a much bigger chunk of your income.
2. People are still living longer in retirement. The increases in life expectancy post-retirement aren't as dramatic as they were in the antibiotic era, but they're still creeping up. That means that you have to take smaller sums out of the kitty each year, so that what you have left will be enough to live on.
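The arithmetic behind "smaller sums out of the kitty" is easy to check. A hypothetical sketch, with illustrative numbers: fixed annual withdrawals from a nest egg earning a modest real return.

```python
def years_until_exhausted(nest_egg, annual_withdrawal, real_return):
    """Years until the kitty runs dry, withdrawing a fixed
    (inflation-adjusted) amount at the start of each year."""
    years = 0
    while nest_egg > 0 and years < 100:
        nest_egg = (nest_egg - annual_withdrawal) * (1 + real_return)
        years += 1
    return years

# Illustrative: a $500,000 nest egg earning a 2% real return.
fast = years_until_exhausted(500_000, 40_000, 0.02)  # 8% withdrawal: 15 years
slow = years_until_exhausted(500_000, 25_000, 0.02)  # 5% withdrawal: 26 years
```

Trimming the withdrawal from 8% to 5% of the starting balance buys roughly an extra decade--and a longer retirement forces exactly that kind of trim.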
3. Government finances are extremely strained. The Baby Boomers are about to dump an even heavier load on them. That means yes, higher taxes--but it also means that despite their formidable voting power, retirements financed mostly on the public dime are very likely to get leaner. Especially because birthrates are falling everywhere--which means that the supply of young, strong-backed immigrants to man the nursing homes will not be as ample as it is now.
4. Employers are not kind to older workers. I wish this weren't so, but I'm very much afraid it is. People who say "I won't be able to retire" may not be given a choice in the matter. Like most modern economies, we've cut a societal deal where you're underpaid in your twenties, and overpaid in your fifties and sixties . . . and as a result, it's very tempting to fire those overpaid oldsters when times get tough.
And once you're forced out in your fifties, it is very, very hard to find a new job of any sort, much less one that pays what you're used to. Even if you're willing to take a big pay cut to work a less prestigious job, employers are reluctant to hire the overqualified--particularly since 99 times out of 100 the overqualified 55-year-old simply does not have the stamina or the life flexibility of the single twenty-somethings who are applying for the same job. And physically, you may not be able to do many of the low-rent jobs that paid your way through college: by the time you're sixty, you're quite likely to have back, joint, or skeletal problems that make it hard to stand on your feet all day or lift heavy objects.
The upshot is that you can no longer plan on "making up" anemic retirement contributions later. You have to start making them--right now.
5. Emergencies seem to be lasting longer than they used to. Before the 1990s, unemployment used to spike sharply during recessions, then fall quickly as demand recovered. We had our first "jobless recovery" under Clinton, and now we've got two more under our belt. That means that the old advice of keeping three to six months' worth of expenses in an emergency fund is no longer enough. Eight months to a year is more realistic.
When I write these posts, I generally get two types of responses: people who smugly tell me that they are saving 30% or more of their income (way to go!) and people who tell me that it is simply not possible for them to save 15-20% of their income.
You know better than I, of course. But most of the research on consumer finance shows the same thing: people can usually save a lot more if they make saving a priority. Most people don't. Saving is an afterthought--it's the residual of whatever hasn't been spent on clothes, groceries, cars, dinners out, school trips, travel soccer team, college tuition, vacation, etc. Unsurprisingly, there's frequently no residual. However, if people decide how much to save, and then budget their consumption out of what is left, they suddenly realize that they could drive an uglier car, take the kids out of dance class, live with the kitchen the way it is, stay home for a week in August instead of going to Disneyworld, and so forth. And those people are not, as you might think prospectively, made desperately unhappy by these sacrifices. Savers are actually happier than the general population--in part, one assumes, because they're less worried.
Many people tell me they can't save because children are so expensive. Children are indeed very expensive. But they're getting more expensive every year, and that's because we're spending more money on them. We're spending more money on houses to get them into good school districts, on activities so that they have every chance to get into Harvard (or the NHL), on clothes and cell phones and video game consoles and the list is endless, plus then there's that tuition to Harvard or some sort of even-more-expensive smaller private college.
These expenses are optional, not mandatory. And before you tell me about how unhappy your child will be if you do not buy him all of these necessities, think about how unhappy he's going to be if you have to move in with him. Better yet, volunteer for some outreach to the bankrupt seniors whose kids wouldn't let them move in, and see how their lives are going.
This is not to criticize. Saving is hard, which is why, just like you, we're trying to figure out how to hit even more ambitious savings goals in the New Year. And consumption is fun. That's why most people struggle to save very much.
But a lot of people are going along on autopilot; they're saving 5% because it seemed safe when they were 25, never mind that they're now 37. They look at the neighbors spending a fortune on cars and school activities and figure that if it's safe for the neighbors, it must be safe for them too. But this is the opposite of the truth. If your neighbors aren't saving much (and trust me, they aren't), that means a less productive economy in the future--and more people trying to claim a very limited supply of public funds. You don't want to be among them.
It helps to remember that the object is not to turn yourself into a miser; it's to make your spending patterns sustainable. Your splurges will actually be a lot more fun if you know that they aren't putting you at risk of bankruptcy, foreclosure or a retirement in poverty.
If you're not saving enough--and you know who you are--don't decide today that you're going to save 15%, and then forget about it tomorrow when you realize how daunting a task that will be. Instead, try this: divert an extra 5% of your income into a 401(k), IRA, or other tax-advantaged savings plan. If your 401(k) is stuffed but you don't have much of an emergency fund--or if, for some reason, you don't qualify for tax-advantaged savings--have 7% of every paycheck diverted to a bank account which isn't linked to your other accounts. It's a slow week at work, the perfect time to fuss with HR paperwork.
The important thing is to pay yourself first. Savings should be the first thing you do, not the last. After you've saved, then you budget your consumption. I won't tell you what to cut, because when you confront your new, slightly leaner budget, you'll be perfectly able to calculate what's no longer worth the money to you. I think you'll be pleasantly surprised to find that after a few weeks or a few months of initial pinch, you won't remember that you miss the money much.
If at the end of the year, you still aren't saving enough, then you can do the same thing again--pull another 5-7% out of every paycheck. Within a few years, you'll be at a healthy level of savings, without excessive fiscal pain.
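Under some illustrative assumptions (a flat $60,000 income and a steady 5% return--both, obviously, simplifications), the ratchet strategy can be sketched like this:

```python
def ratchet_nest_egg(income, start_rate, step, target_rate, years, annual_return):
    """Nest egg from raising the savings rate by `step` each year
    until it reaches `target_rate`, then holding it there."""
    rate, nest_egg = start_rate, 0.0
    for _ in range(years):
        nest_egg = nest_egg * (1 + annual_return) + income * rate
        rate = min(rate + step, target_rate)
    return nest_egg

# Illustrative: start at 5% of a $60,000 income, add 5 points a year
# until reaching 15%, and let it compound for 25 years at 5%.
ratcheted = ratchet_nest_egg(60_000, 0.05, 0.05, 0.15, 25, 0.05)
stuck_at_5 = ratchet_nest_egg(60_000, 0.05, 0.00, 0.05, 25, 0.05)
```

Two painless annual step-ups nearly triple the final balance compared with staying on the 5% autopilot--most of the benefit comes from reaching the higher rate early and letting it compound.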
But the most important thing is this: don't start looking for reasons you can't. If you hunt hard enough, you'll find them. Unfortunately, those reasons aren't going to do a damn thing to pay your house payment if you get laid off, or keep you in prescription drugs when you retire.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure, but rather than cultivate the fine arts of indolence, retirement, he said, was “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
Some people see threats even when none are present. Strangely, it can make them more creative.
For much of his life, Isaac Newton seemed like he was on the verge of a nervous breakdown. In 1693, the collapse finally arrived: After not sleeping for five days straight, Newton sent letters accusing his friends of conspiring against him. He was refraining from publishing books, he said at one point that year, “for fear that disputes and controversies may be raised against me by ignoramuses.”
Newton was, by many accounts, highly neurotic. Brilliant, but neurotic nonetheless. He was prone to depressive jags, mistrust, and angry outbursts.
Unfortunately, his genius might have been rooted in his maladjustments. His mental state led him to brood over past mistakes, and eventually, a breakthrough would dawn. “I keep the subject constantly before me,” he once said, “and wait till the first dawnings open slowly, by little and little, into a full and clear light.”
A tattooed, profanity-loving Lutheran pastor believes young people are drawn to Jesus, tradition, and brokenness.
“When Christians really critique me for using salty language, I literally don’t give a shit.”
This is what it’s like to talk to Nadia Bolz-Weber, the tattooed Lutheran pastor, former addict, and head of a Denver church that’s 250 members strong. She’s frank and charming, and yes, she tends to cuss—colorful words pepper her new book, Accidental Saints. But she also doesn’t put a lot of stock in her own schtick.
“Oh, here’s this tattooed pastor who is a recovering alcoholic who used to be a stand-up comic—that’s interesting for like five minutes,” she said. “The fact that people want to hear from me—that, I really feel, has less to do with me and more to do with a Zeitgeist issue.”
Many educators are introducing meditation into the classroom as a means of improving kids’ attention and emotional regulation.
A five-minute walk from the rickety, raised track that carries the 5 train through the Bronx, the English teacher Argos Gonzalez balanced a rounded metal bowl on an outstretched palm. His class—a mix of black and Hispanic students in their late teens, most of whom live in one of the poorest districts in New York City—was by now used to the sight of this unusual object: a Tibetan meditation bell.
“Today we’re going to talk about mindfulness of emotion,” Gonzalez said with a hint of a Venezuelan accent. “You guys remember what mindfulness is?” Met with quiet stares, Gonzalez gestured to one of the posters pasted at the back of the classroom, where the students a few weeks earlier had brainstormed terms describing the meaning of “mindfulness.” There were some tentative mumblings: “being focused,” “being aware of our surroundings.”
What do Google's trippy neural network-generated images tell us about the human mind?
When a collection of artificial brains at Google began generating psychedelic images from otherwise ordinary photos, engineers compared what they saw to dreamscapes. They named their image-generation technique Inceptionism and called the code used to power it Deep Dream.
But many of the people who saw the images reacted the same way: These things didn’t come from a dream world. They came from an acid trip.
The computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.
The idea behind the project was to test the extent to which a neural network had learned to recognize various animals and landscapes by asking the computer to describe what it saw. So, instead of just showing a computer a picture of a tree and saying, "tell me what this is," engineers would show the computer an image and say, "enhance whatever it is you see."
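The real Deep Dream runs gradient ascent on an image, repeatedly nudging the pixels to amplify whatever activations a trained convolutional network produces. Here is a toy sketch of that "enhance what you see" loop, with a single linear detector standing in for a deep network--everything in it is an illustrative stand-in, not Google's code:

```python
import numpy as np

def dream_step(image, detector, lr=0.1):
    """One step of activation maximization: nudge the image in the
    direction that most increases the detector's response. For a
    linear detector, that gradient is the detector itself."""
    response = float((image * detector).sum())
    return image + lr * detector, response

rng = np.random.default_rng(0)
image = rng.normal(size=(8, 8))     # stands in for the input photo
detector = rng.normal(size=(8, 8))  # stands in for a learned feature

responses = []
for _ in range(20):
    image, r = dream_step(image, detector)
    responses.append(r)
# The detector's response rises every step: the image comes to contain
# more and more of whatever the detector "sees," which is the mechanism
# behind Deep Dream's eyeballs and animal faces emerging from noise.
```

With a real network the gradient is computed by backpropagation through many layers rather than read off directly, but the loop is the same: show the network an image, and push the image toward what the network already thinks it sees.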