Fifteen years after we passed welfare reform, did it work? Ezra Klein takes on this question today.
If welfare reform was meant to cut the rolls, then it definitely worked. And if it was meant to give states the flexibility to cut their spending on the program, it definitely worked. . . If you think the point of the program is to help the poor, then no, welfare reform is not working.
As Jake Blumgart writes at The American Prospect, the reformed program "has failed to cushion the neediest through recessions. While in 2009 the food-stamp program responded to the increased need for government assistance, growing by 57 percent, the number of TANF caseloads merely inched upward...At the heart of the worst recession in 80 years, TANF funds only reached 4.5 million families, or 28 percent of those living in poverty. By contrast, in 1995, the old welfare system covered 13.5 million families, or 75 percent of those living in poverty."
Another possible definition of "working" is that the program has helped or forced a lot of low-income Americans, and particularly single mothers, to find jobs. In the late 1990s, when the labor market was very tight, there's strong evidence that welfare reform was helpful in pushing people into the job market. In the Aughts and, in particular, since the recession hit, it's a lot less clear that welfare reform is increasing employment rather than simply limiting support for the unemployed.
Ezra includes this graph from the Center on Budget and Policy Priorities as evidence that welfare reform has not "worked" in any real sense. As you can see, the percentage of poor families receiving TANF (the successor to AFDC, aka "welfare") has fallen dramatically since welfare reform was enacted.
But I'm not sure why this is supposed to be an indictment of the system. Why is it a problem that fewer poor families are enrolled in a program that is only open to people who aren't working? The American Prospect and the Center on Budget and Policy Priorities can't possibly be lamenting the fact that we no longer have more than 70% of our poorest families on a program that has unemployment as a prerequisite. But the way this graph is used makes it sound like they consider this regrettable.
Ezra mentions that reform moved many welfare mothers into jobs, but I think he gives this short shrift. Leave aside the tiresome bourgeois morality which wants to see people trying to support themselves before they turn to the generosity of their neighbors. People are not made better off by a program that encourages them not to work--as AFDC indisputably did, given how sharply the rolls declined once reform pushed recipients toward work.
Don't get me wrong: it's entirely understandable that people would prefer to collect welfare rather than work long hours at an unpleasant low-wage job. But someone who collects welfare today rather than go to work for $7 an hour is very likely to be collecting welfare ten years from now, when it will still be a rather joyless existence hemmed in by lack of money and the whims of the bureaucracy. Someone who is working at anything has their feet on a path that might actually lead somewhere. As anyone who has suffered through a long spell of unemployment can attest, it's hard to get back into the workforce if you've been out of it a while. It's harder still if you were never really in it and never developed basic skills like showing up on time every day and handling difficult customers.
Welfare enabled people to make bad long-term decisions that were rational short-term choices. Welfare reform changed that. That's good news.
Of course, it's bad news that the mothers who went out to work didn't all gain the comfortable middle class existence we'd ultimately like for them. But there was still a noticeable decline in the number of poor families that persisted even into the early years of the Great Recession:
This looks to me like a modest but real success at weaning families from welfare dependency. Even at the nadir of the worst recession in eighty years, the percentage of families in poverty--as well as the percentage of families on TANF--was below pre-reform levels. Unless you really think that these families would be better off spending the rest of their lives on the dole, this seems like a real achievement.
There's another reason that progressives should celebrate: changing the structure of welfare has eroded much of the opposition to it. As long as people felt like welfare was a way for people to simply live off of tax dollars without working, there was bound to be a lot of opposition to the program. Restructuring it as temporary assistance for those who are overwhelmed by unexpected circumstances has essentially whittled that opposition down to nothing. When was the last time that welfare came up in an election?
Sure, maybe progressives would prefer that a generous system of benefits for anyone who wanted them was the uncontroversial norm--but that doesn't really seem very realistic in a pluralistic and fairly conservative country like America. By ending welfare as we knew it, Clinton preserved the safety net for people who really can't cope. If he hadn't, welfare mothers would now be competing with retirees for money in the Great Deficit Reduction Olympics. And I think we all know who would have lost that race.
Update: several commenters think I should have included the two sentences now at the end of the Ezra Klein quote, which I initially left out because I was already in danger of grabbing the whole post. They think it changes my post. I disagree, because my point remains the same: Ezra is giving short shrift to the successful drive to move people into work. But I can also see why people felt like my clip was misleading, so I've added it, and sorry, Ezra, for grabbing nearly your whole post.
Now about those sentences . . .
After saying "If you think the point of the program is to help the poor, then no, welfare reform is not working," Ezra acknowledges that it was nice that people moved into work in the 2000s, but dismisses this achievement because the decline did not continue steadily toward zero. This is pretty much the standard progressive line on welfare reform--it only looked like it was working because of the awesome Clinton economy--and it's not correct.
It's not, in fact, in question whether we produced a permanent change; we did. There was a substantial structural decline in the percentage of families in poverty which persisted into the Aughts. I could have included the percentage of female-headed families in poverty, or children in poverty, and they would have shown the same trend: all of them clearly inflected downwards around welfare reform. All ticked up during the 2001 recession, but clearly settled at a level much lower than their pre-reform average. I find this hard--actually, impossible--to square with Klein's assertion that if you think the purpose of reform was to help needy families, then no, it hasn't worked.
Dismissing the achievements of welfare reform because the poverty rate didn't decline towards zero makes no sense to me. While it would be nice if it had happened, no one really expected it to. The fact that a miracle failed to materialize is hardly a searing indictment of reform. You can argue that the decline in the poverty rate was assisted by other reforms like boosting the earned income tax credit, and I completely agree. But boosting the EITC does nothing to help people who aren't earning income. If we hadn't done welfare reform, "not earning income" would still describe the majority of poor families.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
The tension between religious liberty and same-sex marriage may eventually come to a head in the courts, but probably not through the Kentucky clerk’s case.
As Rowan County clerk Kim Davis crawls further and further out on a limb, Supreme Court experts agree that she has little chance of prevailing. On August 12, District Judge David Bunning ordered Davis, in her capacity as county clerk, to issue marriage licenses to all couples who meet the statutory criteria for marriage in Kentucky—a definition that, since the Court’s landmark decision in Obergefell v. Hodges, includes same-sex couples.
Davis has refused, citing “the authority of God.” The U.S. Supreme Court, without comment, denied her emergency request for a stay. This throws the case back to the Sixth Circuit, which will hear the appeal of Judge Bunning’s order. Assuming she loses in the Sixth Circuit—a fairly good assumption—she would then have the alternative of petitioning the Supreme Court to hear her religious freedom claim. The Court will eventually hear a case about religious freedom and same-sex marriage, but I don’t think it will be this one.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure, but rather than cultivate the fine arts of indolence, he declared retirement “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
The past is beautiful until you’re reminded it’s ugly.
Taylor Swift’s music video for “Wildest Dreams” isn’t about the world as it exists; it’s about the world as seen through the filter of nostalgia and the magic of entertainment. In the song, Swift sings that she wants to live on in an ex’s memory as an idealized image of glamour—“standing in a nice dress, staring at the sunset.” In the video, her character, an actress, falls in love with her already-coupled costar, for whom she’ll live on as an idealized image of glamour—standing in a nice dress, staring at a giant fan that’s making the fabric swirl in the wind.
The setting for the most part is Africa, but, again, the video isn’t about Africa as it exists, but as it’s seen through the filter of nostalgia and the magic of entertainment—a very particular nostalgia and kind of entertainment. Though set in 1950, the video is in the literary and cinematic tradition of white savannah romances, the most important recent incarnation of which might be the 1985 Meryl Streep film Out of Africa, whose story begins in 1913. Its familiarity is part of its appeal, and also part of why it’s now drawing flak for being insensitive. As James Kassaga Arinaitwe and Viviane Rutabingwa write at NPR:
Though it wasn’t pretty, Minaj was really teaching a lesson in civility.
Nicki Minaj didn’t, in the end, say much to Miley Cyrus at all. If you only read the comments that lit up the Internet at last night’s MTV Video Music Awards, you might think she was kidding, or got cut off, when she “called out” the former Disney star who was hosting: “And now, back to this bitch that had a lot to say about me the other day in the press. Miley, what’s good?”
To summarize: When Minaj’s “Anaconda” won the award for Best Hip-Hop Video, she took to the stage in a slow shuffle, shook her booty with presenter Rebel Wilson, and then gave an acceptance speech in which she switched vocal personas as amusingly as she does in her best raps—street-preacher-like when telling women “don’t you be out here depending on these little snotty-nosed boys”; sweetness and light when thanking her fans and pastor. Then a wave of nausea seemed to come over her, and she turned her gaze toward Cyrus. To me, the look on her face, not the words that she said, was the news of the night:
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well prepared the nation is for the next catastrophe, whether that be a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean that the average citizen or most levels of the government are anywhere near ready for them.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
How the Islamic State uses economic persecution as a recruitment tactic
Before Islamic State militants overran her hometown of Mosul in June 2014, Fahima Omar ran a hairdressing salon. But ISIS gunmen made Omar close her business—and lose her only source of income. Salons like hers encouraged “debauchery,” the militants said.
Omar is one of many business owners—male and female—who say ISIS has forced them to shut up shop and lose their livelihoods in the process. The extremist group has also prevented those who refuse to join it from finding jobs, and has imposed heavy taxes on civilians.
“ISIS controls every detail of the economy,” says Abu Mujahed, who fled with his family from ISIS-controlled Deir al-Zor in eastern Syria. “Only their people or those who swear allegiance to them have a good life.” When they took over Deir al-Zor, ISIS gunmen systematically took control of the local economy, looting factories and confiscating properties, says Mujahed. Then they moved in, taking over local business networks.
A Brooklyn-based group is arguing that the displacement of longtime residents meets a definition conceived by the United Nations in the aftermath of World War II.
No one will be surprised to learn that the campaign to build a national movement against gentrification is being waged out of an office in Brooklyn, New York.
For years, the borough’s name has been virtually synonymous with gentrification, and on no street in Brooklyn are its effects more evident than on Atlantic Avenue, where, earlier this summer, a local bodega, protesting its impending departure in the face of a rent hike, put up sarcastic window signs advertising “Bushwick baked vegan cat food” and “artisanal roach bombs.”
Just down the block from that bodega are the headquarters of Right to the City, a national alliance of community-based organizations that since 2007 has made it its mission to fight “gentrification and the displacement of low-income people of color.” For too long, organizers with the alliance say, people who otherwise profess concern for the poor have tended to view gentrification as a mere annoyance, as though its harmful effects extended no further than the hassles of putting up with pretentious baristas and overpriced lattes. Changing this perception is the first order of business for Right to the City: Gentrification, as these organizers see it, is a human-rights violation.