In between the happiness of Christmas and the promise of the New Year, permit me to introduce a sour note, a hint of a scold. If you're like, well, almost everybody, you're not saving enough. 15% of each paycheck into the 401(k) is the bare minimum you can get away with, not some aspirational level you can maybe hope to hit someday when you don't have all these problems.
I mean, obviously if one out of two workers in your household just lost their job, or has been stricken with some horrid cancer requiring all sorts of ancillary expenses, then it's okay to cut back on the retirement savings for a bit. But let's be honest: that doesn't describe most of us in those years when we don't save enough.
What describes most of those years when we aren't saving is normal life. We moved. We got married or had kids. The kids required entirely expected things like food, clothes, and schooling. Work was hard and we felt we wanted a really nice vacation. Friends and family went through the same normal life stages that we were, requesting that we travel and bring gifts to the happy events.
These things are not an excuse to stop saving, for all that I have used these excuses myself from time to time (and regretted it later, at length). The recession should have driven home some hard facts, but the nation's 3.5% personal savings rate indicates that these lessons haven't quite sunk in, so let me elaborate on a few of them.
1. You cannot count on high asset growth rates to bail out a low savings rate. In the 1990s, we believed that we could guarantee something like an 8% (average) annual return by pumping our money into the stock market and leaving it there. The problem is, this may no longer be true. For the last few decades, there have been a number of factors pushing up the price of stocks:
a. Low interest rates on bonds prompted investors to look for higher returns elsewhere
b. People started believing that over the long term, equities offered a low-risk opportunity for higher returns. Unfortunately in finance, many things are only true if no one believes they are true. If everyone thinks that equities are low risk, they will bid away the "equity premium"--which is to say, the discount that buyers expected for assuming greater risk. At which point, stocks no longer offer a low-risk excess return.
c. Baby boomers who had undersaved started pouring money into the stock market in an attempt to make up for their lack of savings.
However, stock prices cannot indefinitely grow faster than corporate profits; eventually, you run out of greater fools. And future corporate profits are going to be constrained by slower growth in the workforce as baby boomers retire, and by the taxes needed to pay for all the bailouts and stimulus we just did. Unless there's a sudden boom in productivity--entirely possible, but entirely impossible to predict, or count on--there's every reason to expect that stock-market returns will be lower, and more volatile, than what we got used to.
We saw a similar cycle in houses. A mortgage used to be a form of forced saving that gave you an (almost) free place to live in retirement and a little bit of value when you sold the house. We didn't realize that a number of developments had been pushing up the price of homes:
a. The development of the 30-year self-amortizing mortgage, which enabled people to pay a much higher price for a given house than they would have in the era of 5-year balloon mortgages.
b. The baby boom, which increased demand for houses as they aged
c. The run-up in inflation in the 1970s, which gave (relatively inflation-proof) real estate a boost--and then the subsequent decline in inflation (and interest rates), which gave people the illusion of being able to afford more house because the up-front payments were lower.
d. More widely available credit, which let more people take on bigger loans
e. The increasing value of (and competition for) a small number of slots at selective colleges, which put a rising premium on houses in good school districts
These trends gave people the illusion that houses were, in some fundamental way, an "excellent investment". But they're risky in all sorts of ways: neighborhoods can get worse rather than better, local economies can stagnate, the style of your home can go out of fashion.
Moreover, like the stock market, houses are still pretty expensive by historical standards, as a long-term home-price chart from Barry Ritholtz shows.
If you can't count on a steep run-up in asset prices to build up your retirement savings, that leaves you with one alternative: save a much bigger chunk of your income.
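To make that arithmetic concrete, here's a minimal sketch of how the required monthly contribution jumps when expected returns fall. The $1 million target, 30-year horizon, and the 8% versus 5% return figures are illustrative assumptions of mine, not forecasts from the article:

```python
# Monthly savings needed to reach a target nest egg at an assumed
# constant annual return. All figures are illustrative, not forecasts.

def monthly_savings_needed(target, years, annual_return):
    """Payment for an ordinary annuity: target = pmt * ((1+r)^n - 1) / r."""
    r = annual_return / 12          # monthly rate
    n = years * 12                  # number of monthly contributions
    if r == 0:
        return target / n
    return target * r / ((1 + r) ** n - 1)

target = 1_000_000  # hypothetical nest egg
years = 30

at_8 = monthly_savings_needed(target, years, 0.08)
at_5 = monthly_savings_needed(target, years, 0.05)
print(f"At 8%: ${at_8:,.0f}/month; at 5%: ${at_5:,.0f}/month")
```

Under these made-up numbers, dropping the assumed return from 8% to 5% nearly doubles the monthly savings needed to hit the same target, which is the whole point: the lower the return you can count on, the bigger the chunk of income you have to put away.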
2. People are still living longer in retirement. The increases in life expectancy post-retirement aren't as dramatic as they were in the antibiotic era, but they're still creeping up. That means that you have to take smaller sums out of the kitty each year, so that what you have left will be enough to live on.
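A back-of-the-envelope sketch of why longevity forces smaller withdrawals. The $500,000 pot, the 4% return, and the 20- versus 30-year horizons are hypothetical numbers chosen for illustration:

```python
# How much you can draw each year from a fixed pot if it must last
# a given number of years at an assumed constant return.
# The $500,000 pot and 4% return are illustrative assumptions.

def sustainable_annual_withdrawal(nest_egg, years, annual_return):
    """Level payment that exhausts the pot in exactly `years`:
    nest_egg = pmt * (1 - (1 + r) ** -years) / r."""
    r = annual_return
    if r == 0:
        return nest_egg / years
    return nest_egg * r / (1 - (1 + r) ** -years)

pot = 500_000
draws = {years: sustainable_annual_withdrawal(pot, years, 0.04)
         for years in (20, 30)}
for years, w in draws.items():
    print(f"{years}-year retirement: ${w:,.0f}/year")
```

Stretching the same pot over 30 years instead of 20 cuts the sustainable annual draw by roughly a fifth--the "smaller sums out of the kitty" point in miniature.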
3. Government finances are extremely strained. The Baby Boomers are about to dump an even heavier load on them. That means yes, higher taxes--but it also means that despite their formidable voting power, retirements financed mostly on the public dime are very likely to get leaner. Especially because birthrates are falling everywhere--which means that the supply of young, strong-backed immigrants to man the nursing homes will not be as ample as it is now.
4. Employers are not kind to older workers. I wish this weren't so, but I'm very much afraid it is. People who say "I won't be able to retire" may not be given a choice in the matter. Like most modern economies, we've cut a societal deal where you're underpaid in your twenties, and overpaid in your fifties and sixties . . . and as a result, it's very tempting to fire those overpaid oldsters when times get tough.
And once you're forced out in your fifties, it is very, very hard to find a new job of any sort, much less one that pays what you're used to. Even if you're willing to take a big pay cut to work a less prestigious job, employers are reluctant to hire the overqualified--particularly since 99 times out of 100 the overqualified 55-year-old simply does not have the stamina or the life flexibility of the single twenty-somethings who are applying for the same job. And physically, you may not be able to do many of the low-rent jobs that paid your way through college: by the time you're sixty, you're quite likely to have back, joint, or skeletal problems that make it hard to stand on your feet all day or lift heavy objects.
The upshot is that you can no longer plan on "making up" anemic retirement contributions later. You have to start making them--right now.
5. Emergencies seem to be lasting longer than they used to. Before the 1990s, employment used to crater sharply during recessions, then recover quickly along with demand. We had our first "jobless recovery" under Clinton, and now we've got two more under our belt. That means that the old advice of keeping three to six months' worth of expenses in an emergency fund is no longer enough. Eight months to a year is more realistic.
When I write these posts, I generally get two types of responses: people who smugly tell me that they are saving 30% or more of their income (way to go!) and people who tell me that it is simply not possible for them to save 15-20% of their income.
You know better than I, of course. But most of the research on consumer finance shows the same thing: people can usually save a lot more if they make saving a priority. Most people don't. Saving is an afterthought--it's the residual of whatever hasn't been spent on clothes, groceries, cars, dinners out, school trips, travel soccer team, college tuition, vacation, etc. Unsurprisingly, there's frequently no residual. However, if people decide how much to save, and then budget their consumption out of what is left, they suddenly realize that they could drive an uglier car, take the kids out of dance class, live with the kitchen the way it is, stay home for a week in August instead of going to Disneyworld, and so forth. And those people are not, as you might think prospectively, made desperately unhappy by these sacrifices. Savers are actually happier than the general population--in part, one assumes, because they're less worried.
Many people tell me they can't save because children are so expensive. Children are indeed very expensive. But they're getting more expensive every year, and that's because we're spending more money on them. We're spending more money on houses to get them into good school districts, on activities so that they have every chance to get into Harvard (or the NHL), on clothes and cell phones and video game consoles--the list is endless. And then there's the tuition to Harvard, or to some even-more-expensive small private college.
These expenses are optional, not mandatory. And before you tell me about how unhappy your child will be if you do not buy him all of these necessities, think about how unhappy he's going to be if you have to move in with him. Better yet, volunteer for some outreach to the bankrupt seniors whose kids wouldn't let them move in, and see how their lives are going.
This is not to criticize. Saving is hard, which is why, just like you, we're trying to figure out how to hit even more ambitious savings goals in the New Year. And consumption is fun. That's why most people struggle to save very much.
But a lot of people are going along on autopilot; they're saving 5% because it seemed safe when they were 25, and so what if they're now 37? They look at the neighbors spending a fortune on cars and school activities and figure that if it's safe for the neighbors, it must be safe for them too. But this is the opposite of the truth. If your neighbors aren't saving much (and trust me, they aren't), that means a less productive economy in the future--and more people trying to claim a very limited supply of public funds. You don't want to be among them.
It helps to remember that the object is not to turn yourself into a miser; it's to make your spending patterns sustainable. Your splurges will actually be a lot more fun if you know that they aren't putting you at risk of bankruptcy, foreclosure or a retirement in poverty.
If you're not saving enough--and you know who you are--don't decide today that you're going to save 15%, and then forget about it tomorrow when you realize how daunting a task that will be. Instead, try this: divert an extra 5% of your income into a 401(k), IRA, or other tax-advantaged savings plan. If your 401(k) is stuffed but you don't have much of an emergency fund--or if, for some reason, you don't qualify for tax-advantaged savings--have 7% of every paycheck diverted to a bank account which isn't linked to your other accounts. It's a slow week at work, the perfect time to fuss with HR paperwork.
The important thing is to pay yourself first. Savings should be the first thing you do, not the last. After you've saved, then you budget your consumption. I won't tell you what to cut, because when you confront your new, slightly leaner budget, you'll be perfectly able to calculate what's no longer worth the money to you. I think you'll be pleasantly surprised to find that after a few weeks or a few months of initial pinch, you won't miss the money much.
If at the end of the year, you still aren't saving enough, then you can do the same thing again--pull another 5-7% out of every paycheck. Within a few years, you'll be at a healthy level of savings, without excessive fiscal pain.
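The ramp-up described in the last few paragraphs can be sketched in a few lines. The 5% starting bump and 5-point annual increases come from the article's own suggestion; the function name and the 20% target are my illustrative choices:

```python
# Sketch of the "pull another 5% out of every paycheck each year" plan.
# Rates are in whole percentage points; the 20% target is illustrative.

def savings_ramp(start_rate, yearly_bump, target_rate):
    """Yearly savings rates, raising the rate by `yearly_bump` points
    each year until it reaches (but never exceeds) `target_rate`."""
    rates = [start_rate]
    while rates[-1] < target_rate:
        rates.append(min(rates[-1] + yearly_bump, target_rate))
    return rates

# Starting at 5% and adding 5 points a year reaches 20% in three years.
print(savings_ramp(5, 5, 20))
```

The point of stepping up gradually, rather than jumping straight to 15-20%, is exactly the one the article makes: each individual increase is small enough that the budget absorbs it without excessive fiscal pain.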
But the most important thing is this: don't start looking for reasons you can't. If you hunt hard enough, you'll find them. Unfortunately, those reasons aren't going to do a damn thing to make your house payment if you get laid off, or keep you in prescription drugs when you retire.
When it comes to health care and entitlements, the party’s policies don't always align with its coalition’s beliefs.
The Senate Republican health-care bill has been repeatedly crushed in a slow-motion collision between the party’s historic ideology and the interests of its modern electoral coalition. Yet congressional Republicans appear determined to plow right through the wreckage.
Even as the Senate’s latest effort to repeal the Affordable Care Act collapsed on Tuesday, the House Republican leadership released a 10-year federal-budget blueprint that points them toward a similar confrontation, between their dominant small-government dogma and the economic needs of their increasingly blue-collar and older white base.
John F. Kennedy famously said that failure is an orphan. But the failure, at least for now, of the GOP drive against the ACA has many parents. One was a distracted and ineffectual President Trump. Even higher on the list sits Senate Majority Leader Mitch McConnell, who displayed a blinding hubris that will forever cloud his previous reputation for legislative wizardry. Operating with unprecedented secrecy and insularity, McConnell degraded Senate tradition by refusing to hold any public hearings or committee votes on the legislation. His closed-door process provoked not only unified opposition from Democrats, but also every major medical stakeholder. He sought to pressure dissenting senators with unrealistic vote deadlines—then retreated as they repeatedly called his bluff.
Paul Behrends, a controversial staffer associated with the California congressman’s pro-Russia stances, was pushed out of his role on a subcommittee after questions were raised about a recent trip to Moscow.
Paul Behrends, a top aide to Representative Dana Rohrabacher, has been ousted from his role as staff director for the House Foreign Affairs subcommittee that Rohrabacher chairs, after stories appeared in the press highlighting his relationships with pro-Russia lobbyists.
“Paul Behrends no longer works at the committee,” a House Foreign Affairs Committee spokesperson said on Wednesday evening.
Behrends accompanied Rohrabacher on a 2016 trip to Moscow in which Rohrabacher said he received anti-Magnitsky Act materials from prosecutors. The Magnitsky Act is a 2012 bill that imposes sanctions on Russian officials associated with the 2009 death in prison of lawyer Sergei Magnitsky, who had been investigating tax fraud. Natalia Veselnitskaya, the Russian attorney and lobbyist who met with Donald Trump Jr. at Trump Tower last year, reportedly brought up the Magnitsky Act during the meeting.
A new study explores why the latter are far more likely to opt for an elite college where they'd struggle than a so-so one where they'd excel.
There’s a saying in China that it’s better to be the head of a chicken than the tail of a phoenix. The premise of the aphorism—it’s better to be over-qualified than under-qualified relative to one’s surroundings—is so widely accepted that similar versions of it exist across cultures. In Japan, they tend to say that it’s better to be the head of a sardine than the tail of a whale. Americans and Brits often declare that it’s better to be a big frog (or fish) in a small pond than a little frog in a big pond.
Extensive research supports these axioms, particularly in the realm of education. Longitudinal studies have consistently shown that high-performing students at less-selective schools feel more competent, have higher GPAs, and have more ambitious career aspirations than low-performing students at more-selective schools.
The story of a duel between two men, one of whom dies, and the nature of the quest to build artificial intelligence
Marion Tinsley—math professor, minister, and the best checkers player in the world—sat across a game board from a computer, dying.
Tinsley had been the world’s best for 40 years, a time during which he'd lost a handful of games to humans, but never a match. It's possible no single person had ever dominated a competitive pursuit the way Tinsley dominated checkers. But this was a different sort of competition, the Man-Machine World Championship.
His opponent was Chinook, a checkers-playing program created by Jonathan Schaeffer, a round, frizzy-haired professor from the University of Alberta, who operated the machine. Through obsessive work, Chinook had become very good. It hadn't lost a game in its last 125--and since it had come close to defeating Tinsley in 1992, Schaeffer's team had spent thousands of hours perfecting the machine.
Some focus on the largest figures, like total student debt ($1.3 trillion) and average debt ($30,000). So why is the most dangerous student loan number less than $5,000?
"I feel I kind of ruined my life by going to college," Jackie Krowen said. She first took out student loans at 19, to go to community college in Oregon. She borrowed more when she transferred to Portland State University, and even more to go to nursing school at the University of Rochester in New York. Now, more than $150,000 in debt, Krowen told Consumer Reports that she cannot buy a house and fears the specter of her non-dischargeable debt will follow her for the rest of her life.
I read Jackie’s story earlier this summer, and I thought about it constantly while reading the student debt report from the White House's Council of Economic Advisers, which was released today. There’s no doubt that Jackie’s situation is disturbing and sad. It’s not unique: There are many students for whom college is not that promised ticket to the middle class, but rather an albatross that punishes their early adulthood. They are tens of thousands of dollars in debt, in jobs paying half what they expected to earn after college. They cannot buy a home or start a business. They are even afraid to get married and have kids.
Senator John McCain’s glioblastoma diagnosis revives a longstanding debate over the safety of wireless technology.
Senator John McCain’s brain cancer diagnosis is likely to revive a persistent and complex question about the safety of wireless technologies, like cell phones, that emit electromagnetic radiation.
For years, researchers have explored whether cell-phone use can increase a person’s likelihood of getting cancer. And for years their findings have been mixed—and in many cases controversial. The consensus, if there is one, is that the health risks of regular cell-phone usage are probably quite small, if they exist at all. But it’s hard to prove a negative, so the question remains open-ended.
Doctors discovered McCain’s glioblastoma after the 80-year-old senator underwent a procedure to remove a blood clot, his office announced in a statement from the Mayo Clinic Wednesday night. Glioblastoma is one of the most common and also one of the most aggressive kinds of brain tumor. Senator Ted Kennedy received a similar diagnosis in the spring of 2008. He died from the disease about 15 months later, in August 2009. Could there be a connection between the aggressive form of cancer in Kennedy’s brain and his cell-phone usage? U.S. senators tend to do a good deal of business by phone, especially when they're in Washington and away from their constituencies.
Twenty years ago, Luc Besson’s visually stunning film hinged its story not on action or violence, but on love.
The most radical element of Luc Besson’s 1997 space opera The Fifth Element is not the absurdly opulent future-costumes designed by Jean Paul Gaultier. It isn’t the bizarre Southern twang of the Hitler-haircut-sporting villain Zorg (Gary Oldman), nor is it Chris Tucker’s performance as an intergalactic sex symbol who hosts a radio show. It’s that Bruce Willis cries at the opera. In budget, in scale, and in casting, The Fifth Element feels like any other big Hollywood sci-fi movie, featuring popular English-speaking actors running around a high-concept world, complete with lavish sets and CGI effects. But not many blockbusters would let their male star weep at a musical performance.
That set piece comes in the middle of the film as Willis’s character, Korben Dallas, a gun-wielding space cowboy with spiked, peroxide-blonde hair, takes in a show by the blue alien singer Diva Plavalaguna (Maïwenn). Besson’s film has, up until now, been a relentless blitz of action, as Korben follows the mysterious Leeloo (Milla Jovovich) across the galaxy to help retrieve mystical stones that will help her save the world from a great, encroaching evil. But for a second, the movie grinds to a halt, letting Korben take in the extraterrestrial songstress’s solo with tears in his eyes.
Thirty years ago, with the help of a massive coffee table book, the American wedding theatrical complex was born.
There’s a ritual that takes place, several times, during each 22-minute episode of the reality-show juggernaut Say Yes to the Dress. A bride-to-be, who will typically arrive at Kleinfeld’s Manhattan wedding emporium with friends and family in tow, will first introduce the group (her “entourage,” the show will call them) to the person who will be her personal attendant throughout her Kleinfeld Experience. The bride will then be spirited away, from the “Bridal Floor” and its effusions of white, to a simple dressing room. There, she and her attendant will get down to business. “How do you want to look,” the consultant will ask her, with cheerful solemnity, “on your wedding day?”
The bride will reply instantly (“classic,” “ethereal,” “edgy,” “like Beyoncé,” “like a princess”), and if she does not—if, indeed, she betrays any uncertainty about her bridal Look and/or Style and/or Philosophy—the attendant will allow a shadow of disapproval to cross her face. This is part of the ritual. After all, in the Kleinfeld cosmos, a Wedding Day is not really a matter of legal pragmatism, or of religious tradition, or even, really, of love; it is an act of determined transformation. It is a day about Dreams—Dreams whose roots have been growing in the bride’s mind and heart ever since, as it goes, she was a little girl. Dreams made manifest in that most quintessentially American of manners: through the purchase of an extremely expensive piece of clothing.
Most of the country understands that when it comes to government, you pay for what you get.
When I was a young kid growing up in Montreal, our annual family trips to my grandparents’ Florida condo in the 1970s and ’80s offered glimpses of a better life. Not just Bubbie and Zadie’s miniature, sun-bronzed world of Del Boca Vista, but the whole sprawling infrastructural colossus of Cold War America itself, with its famed interstate highway system and suburban sprawl. Many Canadians then saw themselves as America’s poor cousins, and our inferiority complex asserted itself the moment we got off the plane.
Decades later, the United States presents visitors from the north with a different impression. There hasn’t been a new major airport constructed in the United States since 1995. And the existing stock of terminals is badly in need of upgrades. Much of the surrounding road and rail infrastructure is in even worse shape (the trip from LaGuardia Airport to midtown Manhattan being particularly appalling). Washington, D.C.’s semi-functional subway system feels like a World’s Fair exhibit that someone forgot to close down. Detroit’s 90-year-old Ambassador Bridge—which carries close to $200 billion worth of goods across the Canada-U.S. border annually—has been operating beyond its engineering capacity for years. In 2015, the Canadian government announced it would be paying virtually the entire bill for a new bridge (including, amazingly, the U.S. customs plaza on the Detroit side), after Michigan’s government pled poverty. “We are unable to build bridges, we're unable to build airports, our inner city school kids are not graduating,” is how JPMorgan Chase CEO Jamie Dimon summarized the state of things during an earnings conference call last week. “It’s almost embarrassing being an American citizen.”
In an interview with The New York Times, the president said he never would have chosen his attorney general if he knew he would end up recusing himself from the ongoing federal inquiry into the 2016 election.
President Trump strongly criticized Attorney General Jeff Sessions and the upper ranks of the Justice Department on Wednesday, telling The New York Times he would never have chosen Jeff Sessions as attorney general if he knew Sessions was going to recuse himself from the Russia investigation.
The president delivered the extraordinary public rebuke of a close political ally and key Cabinet official in an Oval Office interview with the Times on Wednesday. “Sessions should have never recused himself, and if he was going to recuse himself, he should have told me before he took the job and I would have picked somebody else,” the president said.
Sessions, one of Trump’s earliest high-profile supporters, recused himself in March after media reports that he had met with Russian officials during the campaign, in direct contradiction of his testimony before the Senate that he had no contact with any such officials. The revelations meant Sessions could be questioned by investigators as part of the sprawling federal probe into Russian interference in the 2016 election. Internal ethics rules require Justice Department officials to remove themselves from investigations in which they may be a witness.