Toward the end of his life, MLK's focus began to shift from ensuring racial equality to bridging the economic divide between the rich and poor
The timing was coincidental enough to be eerie. Just as crowds gathered in Washington, D.C. last Friday to dedicate the site for a new memorial on the Mall to Martin Luther King, Jr., I stumbled across the April 19, 1968 issue of Life magazine among a mountain of papers, books and magazines I was clearing out of my parents' house in New York. It was one of only two issues of Life magazine my mother had kept. On the cover was a close-up of Coretta Scott King, "beautiful and veiled in grief," as the writer Gordon Parks described her, at the funeral of her husband. And the coverage inside talked not only of Martin Luther King Jr.'s death and its aftermath, but also of the legacy and work he left behind.
There was, of course, discussion of the work he focused on in his "I Have A Dream" speech, given on the steps of the Lincoln Memorial on August 28, 1963. (The public dedication of the new Memorial was originally scheduled for yesterday, the 48th anniversary of that speech, but Hurricane Irene forced organizers to postpone it.) But by 1968, both the Civil Rights Act and the Voting Rights Act had been passed, and King's focus was shifting from the basic cause of social and political equality for black people to the broader issue of economic equality -- for all poor people, regardless of race.
In his 1967 book Where Do We Go From Here, King noted that there were twice as many poor white people as poor black people in the United States. "Therefore," he wrote, "I will not dwell on the experiences of poverty that derive from racial discrimination." Instead, he argued for better jobs, wages, housing, and education for all people suffering in poverty.
The Life editors also spoke of the "poor people's campaign" King was planning when he died. In an article about a speech Coretta Scott King had given in his place, the day before his funeral service, Life quoted her as saying of her late husband:
He was concerned about the least of these (workers)... We are concerned about not only the Negro poor, but the poor all over America and all over the world. Every man deserves a right to a job or an income so that he can pursue liberty, life, and happiness. Our great nation, as he often said, has the resources, but his question was: "Do we have the will?" Somehow I hope in this resurrection experience the will will be created within the hearts and minds, and the souls and the spirits of those who have the power to make these changes come about.
Forty-three years later, with an African-American president sitting in the White House, it's easy enough to argue that significant progress has been made on the front of racial equality. But what of King's other dream -- of easing the burdens of the poor in a more equitable economic society?
In 1968, roughly 12-13 percent of the country was living below the poverty level. Today, that number is virtually unchanged. What's more, the disparity in income between the richest and poorest Americans has increased over the past decades. A 2010 Slate series on income inequality noted that in 1915, the richest 1 percent of Americans possessed 15-18 percent of the nation's income, and that today, that number has risen to 24 percent. And a few months ago, a PBS News Hour piece headlined "Income Inequality Gap Widens Among U.S. Communities Over 30 Years" looked more closely at the growing disparity of income by area in America.
Accompanying those hard numbers is an arguable hardening of attitudes toward those less well off in the country. Perhaps we all feel closer to the edge than we did in the 1960s, and therefore less inclined to level the playing field. But the sense of people taking care of themselves, as opposed to their neighbors, is far stronger today than it was when King was assassinated. It's hard to imagine today's Congress passing the Social Security Amendments of 1965, which raised Americans' taxes in order to make both Medicare and Medicaid possible.
The U.S. still has astounding financial resources. But the "will" Coretta Scott King talked about in that April 1968 Life article still seems to elude us. Would King himself have been able to make a difference on that front, if he had lived? It's hard to say. But reading through that issue of Life, I was reminded again of the power Dr. King possessed to calmly but resolutely tweak the nation's conscience.
"King," the Life editors wrote, "insisted on the enlargement of the American dream of equality. Steady enlargement is the way it has always been kept alive... He bade white Americans face their simple duty of living up to their own best traditions in a context they had not been accustomed to... He asked to be remembered as a 'drum major for justice... for peace ... for righteousness.' Those old-fashioned abstractions have the force of continuity with what Americans have stood for, and often fought for, since their beginning. King insisted on non-violent means because he took the Sermon on the Mount seriously. But he attracted and defied violence because he took America seriously, and that can be a daring and unpopular thing to do."
King never tried to be a politician, a role necessarily mired in the messy, compromising bogs of campaigning and governance. His chosen role, instead, was to make it difficult for politicians to ignore his voice: a voice that argued convincingly for what was right, for what was just, and for how we needed to be, and could be, better. Not better off, but better members of the human race.
Would King's voice have made a difference in the economic inequality of today, or the tone of the debates raging over health care, taxes, and who should bear the burden for what? It's hard to say. But as the site for his memorial is dedicated in Washington, it's worth pondering his other dream... what he would have made of the arguments being waged over it today, and whether he would have thought us closer to, or further from, our better selves than we were the day he died.
FEMA Director Craig Fugate on why the Katrina response failed, why it’s important to talk about “survivors” instead of “victims,” and why citizens can’t just wait for the government to save them in a huge disaster
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure, but rather than cultivate the fine arts of indolence, he declared retirement to be "time for doing something useful." Hence, the many activities of Franklin's retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of "The First American" in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most "useful" was the working life he left behind when he embarked on a life of leisure.
First, he was tweeting about peas and guacamole. Then he was making playlists on Spotify. Now, he’s commenting on Facebook posts.
President Obama left a message on the above post by Humans of New York, a popular photo blog run by Brandon Stanton, whose content oscillates between heartwarming and heartbreaking. Here’s the post, with the signature “-bo” signoff the president uses on social media to tell users it’s the real deal.
The post comes a day after the Obama administration secured enough Senate support to guarantee Congress cannot block the nuclear deal brokered with Iran in June. And if you choose to read into its meaning, the post is the latest move by Obama to interact with Iran in ways that other recent U.S. presidents have not. In 2013, Obama spoke with Iranian President Hassan Rouhani by phone, marking the first conversation between U.S. and Iranian heads of state in more than 30 years.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well the nation is prepared for the next catastrophe, whether that be a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn't mean that the average citizen or most levels of the government are anywhere near ready for them.
A tattooed, profanity-loving Lutheran pastor believes young people are drawn to Jesus, tradition, and brokenness.
“When Christians really critique me for using salty language, I literally don’t give a shit.”
This is what it’s like to talk to Nadia Bolz-Weber, the tattooed Lutheran pastor, former addict, and head of a Denver church that’s 250 members strong. She’s frank and charming, and yes, she tends to cuss—colorful words pepper her new book, Accidental Saints. But she also doesn’t put a lot of stock in her own schtick.
“Oh, here’s this tattooed pastor who is a recovering alcoholic who used to be a stand-up comic—that’s interesting for like five minutes,” she said. “The fact that people want to hear from me—that, I really feel, has less to do with me and more to do with a Zeitgeist issue.”
Climate change means the end of our world, but the beginning of another—one with a new set of species and ecosystems.
A few years ago in a lab in Panama, Klaus Winter tried to conjure the future. A plant physiologist at the Smithsonian Tropical Research Institute, he planted seedlings of 10 tropical tree species in small, geodesic greenhouses. Some he allowed to grow in the kind of environment they were used to out in the forest, around 79 degrees Fahrenheit. Others, he subjected to uncomfortably high temperatures. Still others, unbearably high temperatures—up to a daily average temperature of 95 degrees and a peak of 102 degrees. That’s about as hot as Earth has ever been.
It’s also the kind of environment tropical trees have a good chance of living in by the end of this century, thanks to climate change. Winter wanted to see how they would do.