In the four days of public mourning and recognition devoted to the late Senator Edward M. Kennedy, a few themes emerged. His devotion to the liberal cause. His checkered political and personal life. His devotion to his family, and the families of his brothers. His faith, laughter, and love of life. But also, his consummate skill as a legislator who had a legendary ability to get things done, in no small part because he was respected on both sides of the aisle and had developed friendships with many of his political adversaries. He had, as Republican Orrin Hatch of Utah noted, a unique talent for seeking out common ground with an adversary, no matter how small that space was, and then working to get something accomplished through that place of shared priorities or perspective.
One could regard Kennedy's acts as shrewd Machiavellian maneuvering. But politicians are skilled enough in that art to recognize the difference between authentic connection and political expediency, and the friendship and sorrow on the faces of Kennedy's Republican colleagues these past few days resonated as something very authentic.
So it appears that Kennedy was a master at truly seeking and finding common ground; better than many of us seem to be, these days. Why is that? Many reasons. But part of the answer may be that finding common ground first requires a deep and compassionate understanding and acceptance of the idea that humans are complex, multi-dimensional creatures, as multi-faceted as any cut diamond. And beyond that, an understanding of how seemingly irreconcilable characteristics and beliefs can coexist within a single person.
We all learn, without ever being told, that people have many characteristics, some of which we like better than others. Most siblings understand that before the age of six. But fewer of us have to wrestle with the far more difficult mix posed by a person who at once exhibits beliefs or characteristics we find admirable, along with others we find abhorrent. In most cases, if we see evidence of a belief or character trait we find objectionable, we steer clear. And our ability to keep our distance from those we dislike has grown in the past few decades.
In a city apartment, it's hard not to deal with your neighbors. Even front-porch America forced a bit more neighborly interaction. But with the advent of the backyard deck, the automatic garage-door opener and the suburban sprawl of gated communities, we gained a far greater ability to separate ourselves from others unless we expressly chose to socialize with them. And that trend of specialization has grown. We can now not only get 200 narrowly focused cable or satellite channels; we can also choose from thousands and thousands of narrowly focused blogs and websites for our "news." No matter how arcane our points of view, we can find and immerse ourselves in a like-minded community through chat rooms and forums across the internet. Many more of us telecommute, reducing our need to learn to cope with co-workers whose views don't mirror ours. We don't even have to listen our way through tracks on an album or CD we don't like in order to get to our favorites. We just download the individual songs we want.
In short, it's increasingly possible to live our lives in a "silo" of like-minded thought, music, entertainment and personalities. The problem with this, of course, is that it isolates us from those who would teach us difficult and uncomfortable truths about human complexity and, through that, the art of finding common ground.
A number of years ago, I found myself living for a time in the middle of a social and professional circle where nobody else shared my worldview, or my opinions on most subjects. If I had had more options for social interaction or friendship, I probably wouldn't have spent much time getting to know the people in the group very well. But because my social options were limited, I had to look for some kind of connection or common ground. And as I got to know some of the individuals better, I saw tremendous acts of kindness and generosity, deep and heartfelt fears and sorrows, and traits of loyalty, honesty, and integrity that were both admirable and authentic.
The tough part was that in those same people, I also saw acts, and heard opinions, that were deeply abhorrent to me. Acts, phrases, insults and opinions that I would willingly spend a lifetime fighting to overcome. How could such diametrically opposed traits coexist in a single person? And how could I reconcile my admiration for parts of a person with my visceral opposition to other pieces of the puzzle?
There was no running from the question. I confronted it daily, in all my interactions. I wrestled through outrage, generalization and judgment. I tried to change their opinions through argument. But in the course of that struggle, I also slowly gained new understanding, not just of how complex humans are, but of how few people are all right, or all wrong, or without merit or fault. And that just as my admiration of a person's strengths did not mean I had to condone other traits or opinions I vehemently opposed, neither did my dismay at those traits negate the person's other strengths.
In the end, I came to some kind of peace with the possibility of agreeing and disagreeing with someone else, all at the same time. Of understanding and respecting a little bit better how they came to see the world the way they did, even as I continued to argue for a different set of attitudes, priorities, or rules. Of getting beyond a global "good guy/bad guy" dichotomy to a more nuanced place and perspective about how we all end up with such different takes on the world. As the philosopher/writer Joseph Campbell said, "One has to go beyond the pairs of opposites to find the real source ... When you have come past the pairs of opposites, you have reached compassion."
There's undoubtedly more to the equation, of course. Senator Kennedy also came from an era in politics and Congressional life without televised hearings and the grandstanding that evolved from that, or a 24/7 media culture that rewards simplistic sound-bites over complex and nuanced positions, negotiations, or approaches. Perhaps if we want more bipartisanship among our politicians, we have to turn off the cameras and grant them a lot more privacy in which to develop more nuanced relationships.
But fundamental to forging those relationships, to finding the small spaces of common ground upon which they can be built, is first gaining an understanding and acceptance of the many and oftentimes disparate facets that can coexist in another person. Of gaining a deep and authentic respect and compassion for the whole of a person that allows genuine friendship, and an open spirit of alliance on the 10 percent of shared purpose despite 90 percent of adamantly held opposition.
Fewer of us these days have to wrestle our way to a deep or intimate understanding of those human complexities. But if Kennedy was a master at the art, perhaps it's in part because he wrestled with that question every time he looked in the mirror. In coming to some measure of understanding or compassion about his own behavior and past, perhaps he developed a deeper acceptance of the complexity, differences and imperfections of others, as well. And a sense of compassion beyond simple opposites that not only led to some of his greatest achievements, but is surely one of the qualities his colleagues, on both sides of the aisle, will miss the most.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
Shedding pounds is usually a losing battle—research suggests it’s better to just focus on building a healthy lifestyle.
“My own history of yo-yo dieting started when I was 15 and lasted about three decades,” said Sandra Aamodt, a neuroscientist and the author of Why Diets Make Us Fat, at the Aspen Ideas Festival on Saturday. “I lost the same 15 pounds pretty much every year during that same period, and gained it back regular as clockwork.”
This is a classic tale—the diet that doesn't take, the weight loss that comes right back. The most recent, extreme, and highly publicized case was a study of contestants from the reality show The Biggest Loser, most of whom, six years after losing 100 to 200 pounds, had gained most of the weight back and had significantly slowed metabolisms.
The study provided a dramatic example of how the body fights against weight loss. And sheer force of will is rarely sufficient to fight back.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
The kerfuffle over Kim Kardashian's drug-promoting Instagram selfie is nothing new: As long as the FDA has existed, it's had to figure out how to regulate drug advertisements in new forms of communication technology.
Last month, celebrity-news and health-policy bloggers had a rare moment of overlap after the Food and Drug Administration issued a warning letter to the pharmaceutical company Duchesnay, which manufactures Diclegis, a prescription-only anti-nausea pill. At stake: a single selfie with pill bottle.
The image that attracted the censure of the FDA was an Instagram posted on July 20 by Kim Kardashian. The image featured her upper torso, right hand, and face, with a bottle of Diclegis prominently displayed in her grasp. “OMG,” the caption began:
Have you heard about this? As you guys know my #morningsickness has been pretty bad. I tried changing things about my lifestyle and my diet, but nothing helped, so I talked to my doctor. He prescribed me Diclegis, I felt better, and most importantly it’s been studied and there is no increased risk to the baby.
Patrick Griffin, Bill Clinton’s chief congressional affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, in part because the provision was embedded in the larger crime bill and in part because the ban itself expired in 2004.
The U.K.’s vote to leave the European Union betrays a failure of empathy and imagination among its leaders. Will America’s political establishment fare any better?
If there is a regnant consensus among the men and women who steer the Western world, it is this: The globe is flattening. Borders are crumbling. Identities are fluid. Commerce and communications form the warp and woof, weaving nations into the tight fabric of a global economy. People are free to pursue opportunity, enriching their new homes culturally and economically. There may be painful dislocations along the way, but the benefits of globalization heavily outweigh its costs. And those who cannot see this, those who would resist it, those who would undo it—they are ignorant of their own interests, bigoted, xenophobic, and backward.
So entrenched is this consensus that, for decades, in most Western democracies, few mainstream political parties have thought to challenge it. They have left it to the politicians on the margins of the left and the right to give voice to such sentiments—and voicing such sentiments relegated politicians to the margins of political life.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Thoughts on the first episode of ESPN’s five-part documentary
Every fall Sunday, when I was a kid, half an hour before the pre-game shows and an hour before the games themselves, I would tune in to the latest offering from NFL Films. This was the pre-pre-game show—an assembly of short films derived from the massive archive of professional football. Steve Sabol, whose father founded NFL Films, would preside. He’d offer an introduction and then throw it to John Facenda or Jefferson Kaye, who would narrate the career highlights of players like Gale Sayers, Earl Campbell, or Dick “Night Train” Lane.
“Highlights” understates what NFL Films was actually doing. The shorts were drawn from some of the most beautifully shot footage in all of sports. It wasn’t unheard of for NFL Films to go high concept—this piece on football and ballet, with cameos from Allen Ginsberg and George Will, may be the definitive example. Great football plays would be injected not with the normal hurrahs, but with poetry. When Facenda, for instance, wanted to introduce a spectacular touchdown run by Marcus Allen, he did so in the omniscient third person: “On came Marcus Allen—running with the night.”