Dwelling on our own suffering makes us blind to the pain of others
Family members of victims console each other as they gather to pay their respect at the reflecting pool at Ground Zero during the eighth anniversary commemoration ceremony / Reuters
On Sunday, New York will pause to remember and honor the victims who died in the attacks on the World Trade Center 10 years ago. Not as grandly as we did on the first anniversary of the attacks, of course. But that's as it should be. The wounds were fresh then, so the drama and emotion were both much higher. Nearly 3,000 people died in a single morning, and the images of people voluntarily jumping to their deaths are seared in our collective memory; a graphic reminder of just how horrific the attacks and their damage were. But the damage of 9/11 went beyond those actually killed. And the challenges facing the survivors are more complex.
Some of the people participating in the anniversary events in New York (and in others commemorating those killed on Flight 93 in Pennsylvania and at the Pentagon) will be literal survivors of those attacks. Others will be family members who were, by association, either emotional victims, or survivors, or both, depending on how you look at it. In truth, all Americans are peripheral survivors, in that we were all traumatized by the events of that day and had our lives changed by their fallout.
And yet, while all that is true, and the honoring and commemoration of our individual and collective loss is both legitimate and appropriate, we should still approach our identification with being victims or survivors with a healthy dose of caution.
At the end of June, I attended an unusual summit conference sponsored by Google Ideas, the Council on Foreign Relations, and the Tribeca Film Festival. Titled the "Summit Against Violent Extremism," it brought together some 200 people who had been involved in, or had been affected by, violent extremism of one kind or another, from Islamic jihadists to nationalist fighters, to gang members, to neo-Nazi skinheads, to Colombian jungle rebels.
The organizers separated the attendees into two groups: "survivors" and "formers" (formers being former extremists). All of the attendees were now working actively to combat violent extremism. And the stories of loss among the survivors were heart-rending. But their inclusion in the conference implied a bit of moral preaching to the "formers": we, the victims, plead with you, the perpetrators, to feel our pain. And one of the most striking moments of the conference, for me, came near the end, when one of the organizers asked a former Islamist fighter (now a soft-spoken Imam in a London mosque who works actively against violent extremism) if he'd ever had someone with a survivor's perspective speak at his mosque.
"I would like to make a couple of points," the Imam answered quietly. "First of all, I HAVE suffered. My little brother was killed, and I have lost 22 relatives in war. So," he said, gesturing to a survivor on the same panel, "I know about personal suffering in the same way as you have done."
That one simple interchange conveyed two powerful and cautionary lessons about the hazards of victimhood and survivorhood.
When tragedy or violence strikes us, we are victims of it. And if we survive it we are, by definition, survivors. I nearly died at the age of 20 when the car I was in was struck at high speed by one driven by an angry, drunk young man who'd just lost his job. The path back from that darkness, physically and emotionally, was painful and long. The good news is, humans are remarkably adaptable and resilient. You go on from tragedies. You just don't go on intact, or the same. And the self that you drag and pull forward from a tragedy feels (and sometimes is) so battered and imperfect that there can be great strength in acknowledging the injustice of what happened (I was a victim) and the difficulty of coming back from it (I am a survivor). It can help a battered soul heal.
But if those labels become part of our longer-term identities instead of just phases of healing, the focus on our own pain and suffering can blind us to the pain and suffering of others. The suffering of a mother whose innocent child was killed in the Twin Towers, while unique, is not more or less than the suffering of a mother whose innocent child was killed by a bullet or bomb, regardless of who fired it, dropped it or set it off, in Iraq, Pakistan or any other place in the world.
The interchange at the conference was also a cautionary reminder about the dark places where a sense of victimhood can lead. Many of the "formers" were also victims, and survivors, of injustice and violence of a different sort. But their righteous sense of their status as victims took them down a road where, at some point, any reaction became justified.
Nahum Pachenik, one of the "formers" at the conference who described himself as "born into conflict" as the child of Israeli settlement pioneers near Hebron, even joked a bit about the victimhood rivalry between the Israelis and the Palestinians.
"The two sides have very similar thinking," he said. "[They say] 'We are the victim.' 'No, WE are the victim.' 'No, we are MORE the victim.'"
Victimhood is wonderfully appealing, Pachenik said, because "in the victim position, you don't have to admit anything, because all of the responsibility is on the other."
Nevertheless, Pachenik finally came to the conclusion that if he wanted to move away from the stalemate of violence around him, he had to give up the comfort of victimhood for the tougher and more challenging path of knowledge. He now runs an organization that strives to promote better knowledge and understanding between Palestinians and Israelis... starting with learning each other's language.
"Knowledge," Pachenik said, "is the opposite of the position of the victim. Today, I believe it is more important to promote education. It's important to learn the language of the other. Because if you do that, there is, maybe, a place to meet."
The victims of 9/11 who did not survive will always be victims, and should be honored and remembered as such. But even they wouldn't want to be remembered, or identified, solely by the label of "victim." As for the rest of us... well, we are survivors. But we are -- and need to be -- far more than that if we want to stop the cycle of violence that helps cause attacks like that in the first place. It's a tempering point worth remembering, even as we pause to honor the lives and memory of those who died.
On Tuesday, the late-night host once again devoted his show to the politics of American health care. This time, though, he offered indignation rather than tears.
“By the way, before you post a nasty Facebook message saying I’m politicizing my son’s health problems, I want you to know: I am politicizing my son’s health problems.”
That was Jimmy Kimmel on Tuesday evening, in a monologue reacting to the introduction of Graham-Cassidy, the (latest) bill that seeks to replace the Affordable Care Act. Kimmel had talked about health care on his show before, in May—when, after his newborn son had undergone open-heart surgery to repair the damage of a congenital heart defect, he delivered a tearfully personal monologue sharing the experience of going through that—and acknowledging that he and his family were lucky: They could afford the surgery, whatever it might cost. Kimmel concluded his speech by, yes, politicizing his son’s health problems: He emphasized how important it is for lower- and middle-class families to have comprehensive insurance coverage, with protections for people with preexisting conditions. “No parent,” he said, speaking through tears, “should ever have to decide if they can afford to save their child’s life. It shouldn’t happen.”
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump's predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump's forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America's founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
Alcoholics Anonymous's faith-based 12-step program dominates treatment in the United States. But researchers have debunked central tenets of AA doctrine and found dozens of other treatments more effective.
J.G. is a lawyer in his early 30s. He’s a fast talker and has the lean, sinewy build of a distance runner. His choice of profession seems preordained, as he speaks in fully formed paragraphs, his thoughts organized by topic sentences. He’s also a worrier—a big one—who for years used alcohol to soothe his anxiety.
J.G. started drinking at 15, when he and a friend experimented in his parents’ liquor cabinet. He favored gin and whiskey but drank whatever he thought his parents would miss the least. He discovered beer, too, and loved the earthy, bitter taste on his tongue when he took his first cold sip.
His drinking increased through college and into law school. He could, and occasionally did, pull back, going cold turkey for weeks at a time. But nothing quieted his anxious mind like booze, and when he didn’t drink, he didn’t sleep. After four or six weeks dry, he’d be back at the liquor store.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
Trump’s bellicosity undermines his ability to deter the Kim regime’s nuclear-weapons and missile programs.
How are we to make sense of the president of the United States—a man with unitary launch authority for over a thousand nuclear weapons—going before the United Nations General Assembly and threatening to annihilate a sovereign state? That’s exactly what President Donald Trump did on Tuesday, halfway into a long, winding speech on everything from sovereignty to UN funding. “The United States has great strength and patience, but if it is forced to defend itself or its allies, we will have no choice but to totally destroy North Korea,” Trump read carefully from his teleprompter. In one breath, he touted the virtues of the nation-state and sovereignty and, in another, promised the utter destruction of a sovereign state.
“If the world’s major powers can’t agree on what the UN is for, what does that mean for its future?”
Since the Second World War, American presidents have repeatedly gone before the United Nations General Assembly and made a similar argument: The United States has national interests just like any other country, but in the modern era those interests are increasingly international in scope and shared by people around the world, requiring more of the multilateral cooperation that the UN was founded to foster.
John F. Kennedy argued that nuclear weapons necessitated “one world and one human race, with one common destiny” guarded by one “world security system,” since “absolute sovereignty no longer assures us of absolute security.” Richard Nixon spoke of a “world interest” in reducing economic inequality, protecting the environment, and upholding international law, declaring that the “profoundest national interest of our time” is the “preservation of peace” through international structures like the UN. In rejecting tribalism and the walling-off of nations, Barack Obama asserted that “giving up some freedom of action—not giving up our ability to protect ourselves or pursue our core interests, but binding ourselves to international rules over the long term—enhances our security.” These presidents practiced what they preached to varying degrees, and there’s long been a debate in the United States about the extent to which America’s sovereign powers should be ceded to international organizations, but in broad strokes the case for global engagement was consistent.
Today’s young children are working more, but they’re learning less.
Step into an American preschool classroom today and you are likely to be bombarded with what we educators call a print-rich environment, every surface festooned with alphabet charts, bar graphs, word walls, instructional posters, classroom rules, calendars, schedules, and motivational platitudes—few of which a 4-year-old can “decode,” the contemporary word for what used to be known as reading.
Because so few adults can remember the pertinent details of their own preschool or kindergarten years, it can be hard to appreciate just how much the early-education landscape has been transformed over the past two decades. The changes are not restricted to the confusing pastiche on classroom walls. Pedagogy and curricula have changed too, most recently in response to the Common Core State Standards Initiative’s kindergarten guidelines. Much greater portions of the day are now spent on what’s called “seat work” (a term that probably doesn’t need any exposition) and a form of tightly scripted teaching known as direct instruction, formerly used mainly in the older grades, in which a teacher carefully controls the content and pacing of what a child is supposed to learn.
The bill would take funding from governments facing public-health crises to provide a short-term boon to a smaller number of states that have refused to expand Medicaid.
“Obamacare, for whatever reason, favors four blue states against the rest of us.” So South Carolina Senator Lindsey Graham, in a floor speech on Monday, defended the central rationale of his Obamacare replacement, the Graham-Cassidy bill. In that speech and other statements, Graham has cast his bill as a redistribution, taking federal Obamacare money poured into the liberal bastions of California, New York, Massachusetts, and Maryland, and giving some of it to cash-strapped red states that have been left out, and whose sicker populations have languished. In this telling, Graham is Robin Hood, and his co-sponsors Bill Cassidy of Louisiana, Dean Heller of Nevada, and Ron Johnson of Wisconsin are his merry men.
Donald Trump used his first address at the United Nations to redefine the idea of sovereignty.
Donald Trump’s first speech to the United Nations can best be understood as a response to his predecessor’s final one. On September 20, 2016, Barack Obama told the UN General Assembly that “at this moment we all face a choice. We can choose to press forward with a better model of cooperation and integration. Or we can retreat into a world sharply divided, and ultimately in conflict, along age-old lines of nation and tribe and race and religion.”
Three hundred and sixty-four days later, Trump delivered America’s answer: Option number two. His speech on Tuesday turned Obama’s on its head. Obama focused on overcoming the various challenges—poverty, economic dislocation, bigotry, extremism—that impede global “integration,” a term he used nine times. Trump didn’t use the term once. Obama used the word “international” 14 times, always positively (“international norms,” “international cooperation,” “international rules,” “international community”). Trump used it three times, in each case negatively (“unaccountable international tribunals,” “international criminal networks,” “the assassination of the dictator's brother using banned nerve agents in an international airport”). Obama warned of a world “sharply divided… along age-old lines of nation and tribe and race and religion.” Trump replied by praising “sovereignty” or invoking “sovereign” no fewer than 19 times. And while he didn’t explicitly defend divisions of “tribe and race and religion,” he talked about the importance of nations “preserving the cultures,” which is a more polite way of saying the same thing.
A new book by the economist Tim Harford on history’s greatest breakthroughs explains why barbed wire was a revolution, paper money was an accident, and HVACs were a productivity booster.
In the beginning, it wasn’t the heat, but the humidity. In 1902, the workers at Sackett & Wilhelms Lithographing & Printing Company in New York City were fed up with the muggy summer air, which kept warping their paper and ruining their prints. To fix the problem, they needed a humidity-control system. The challenge fell to a young engineer named Willis Carrier. He devised a system to circulate air over coils that were cooled by compressed ammonia. The machine worked beautifully, alleviating the humidity and allowing New York’s lithographers to print without fear of sweaty pages and runny ink.
But Carrier had a bigger idea. He recognized that a weather-making device to control humidity had even more potential to control heat. He went on to mass-manufacture the first modern air-conditioning unit at the Carrier Corporation (yes, that Carrier Corporation), which is still one of the largest HVAC manufacturers in the world. Air-conditioning went on to change far more than modern printing—it shaped global productivity, migration, and even politics.