There's been a lot of discussion, this week, about whether President Obama has fulfilled enough promises or expectations of change since his election a year ago. "I voted for him, and I really thought everything would be different," one disappointed voter from Iowa said in a televised interview.
It would be easy to dismiss the expectations of such voters as unrealistic or naive, but we often expect more from big watershed events, and in more sweeping, immediate fashion, than life dishes out. Consider, for example, another important anniversary coming up on Monday: the 20th anniversary of the fall of the Berlin Wall.
On November 9, 1989, after weeks of protest and slow chiseling away of the East German Politburo's power, the East German government announced that henceforth, East Berliners could travel freely to the west. Faced with massive crowds at the border checkpoints, the guards opened the gates, and people streamed through. A party erupted on top of the wall, and people started hacking away at it with hammers and picks.
It was a celebration and global party; the end of an era that had brought incalculable pain to millions of Germans separated from family members and, over the years, death to hundreds who had tried to cross over to the west anyway. I wrote about some of the sacrifices, and the lingering legacy of the Wall, in an essay on this site last May, after a German artist released an exhibit sparked by the anniversary of the Wall's demise.
Given all the damage and fear it caused, the fall of the Wall was truly a historic watershed moment to cheer. But then the celebration and festivities ended, and the real work of reunification began.
In the moment of celebration, it seemed all good. The rift was healed; the country would be united again. Cue the trumpets and national anthem. Roll credits. If the story had been a Hollywood movie, it actually would have ended there, because the morning after is always messier and less satisfying than the triumphant night. Every screenwriter worth their salt knows that.
There were, of course, some things that did change immediately. People could travel back and forth across the border. Restrictions ended. But national, economic, and social integration and change proved far more challenging than perhaps anyone in 1989 would have predicted.
Many in the West resented the tax they had to pay to upgrade the infrastructure, buildings, and resources in the east. And "Ossies" (Easterners, from the German word "Ost" for East) found themselves in a no-man's land between cultures. They were suddenly without the social security of the Russian/East German state system, but were still often considered second-class citizens by their western counterparts. Their knowledge of Russian and German didn't help them in an economic world where English had become the common language. For all the celebration on November 9th, change brought with it a disruption of the world they'd known ... and gave rise to fear.
In 2004, 15 years after the Wall came down, I spent some time in the eastern German village of Krausnick, in the Spreewald, or Spree Forest. Krausnick was founded in 1004, so it had seen a lot of changes. It had also seen a lot of battles. In 1945, more than 30,000 German soldiers and 10,000 civilians from the area were caught by the advancing Russian army and slaughtered over the course of a week. Looking at some of the dilapidated houses and crumbling stone walls in the area, I could imagine the Russian soldiers advancing over the land, and the terror that sight must have bred.
One would think, after a massacre so terrible, that the Russians would have been hated forever. But when I visited, there was still a memorial in the center of the village celebrating the Red Army heroes who had died there "in the war against Fascism, 1941-1945." The same soldiers, mind you, who had killed so many of the local people. And as I watched, a couple of older villagers carefully cleaned the memorial and planted new flowers in front of it. The Russians had been gone for 15 years. And still the villagers preserved the memorial with loving care.
When I asked about it, several people told me that, in truth, they actually missed the Russians, because at least then, you had security. You didn't have to worry about losing your job or not being able to pay your rent. All you had to do was keep your head down and your nose clean. It was nice, they acknowledged, to not have to wait 20 years for a bad car. But you had new burdens of figuring out how to pay for that car, now.
Twenty years after the Berlin Wall fell, Germany is still struggling to fulfill the promise of that event. And that's a change that, at least in theory, everyone in Germany wanted. Imagine if the country had been deeply split on the basic premise of reunification.
Consider the events of July 2, 1964. On that date, President Johnson signed the Civil Rights Act into law. Its passage was the result of years of effort and struggle, and the signing of that Act separated American history into "before" and "after." As with the Berlin Wall, some things changed immediately. On July 3, 1964, discriminating against a person on the basis of the color of his or her skin was suddenly illegal. But life did not change with the stroke of a pen. Four years later, not only had glorious change not triumphantly come to pass, but both Martin Luther King--who had been present at the signing ceremony--and Robert Kennedy, one of the Civil Rights movement's heroes ... were dead at the hands of assassins.
Change--especially nationwide change--is a slow-moving train. Power shifted with the Civil Rights Act, and the wheels of change were set in motion. But a year later, a black person--especially in the South--might not have noticed that much tangible difference in their life. Forty-five years later, we still fight some of those battles, as the events of the past year have certainly illustrated.
Symbolically, many things can change in a day. A law is passed, a wall comes down, a couple gets married, or a person is elected President. The event that initiates the change is called a watershed, because it marks the moment and place where the course of things turns in a new direction. But even in the best of circumstances, it takes a while after that event for any shift to become visible. Especially in a deep, complex, and layered environment where all the currents aren't headed in the same direction.
It's a point worth remembering. Too often, we look to those big, symbolic events as magical tonics that will change everything overnight--maybe because we were fed so many "and then they lived happily ever after" endings. More than one person has imagined that when they got married (or became a parent, or got that new job, or that new ... fill in the blank) they'd magically become happy ... only to discover that it takes a lot of work, patience, and time to make the promise of that symbolic change anything close to real.
The truth is, even events as big as the demise of the Berlin Wall don't change a country or the world overnight. They just make a new kind of change possible. Even if the journey turns out to be longer, rockier, and more complex than we wished or imagined ... or a Hollywood screenwriter would have written it.
Photo Credit: Flickr user antaldaniel, Wikimedia Commons
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, "doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ‘aughts, became known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1, 2015, marked a curious milestone in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amount to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Encouraging a focus on white identity is a dangerous approach for a country in which white supremacy has been a toxic force.
Donald Trump and the disaffected white people who make up his base of support have got me thinking about race in America. “Trump presents a choice for the Republican Party about which path to follow––” Ben Domenech writes in an insightful piece at The Federalist, “a path toward a coalition that is broad, classically liberal, and consistent with the party’s history, or a path toward a coalition that is reduced to the narrow interests of identity politics for white people.”
When I was growing up in Republican Orange County during the Reagan and Bush Administrations, lots of white parents sat their kids in front of The Cosby Show, explained that black people are just like white people, and inveighed against judging anyone by the color of their skin rather than the content of their character. The approach didn’t convey the full reality of race as minorities experience it. But it represented a significant generational improvement in race relations.
An image of a small child evokes an unfathomably huge tragedy.
I had just dropped my son off at daycare when I opened Twitter and came across a photo that over the next 24 hours would become a totem of the refugee crisis in Europe and the Middle East, and the blight that is the Syrian civil war. The picture would quickly reappear, this time as an earnest social-media meme, at a meeting of dithering UN officials and a gathering of unfeeling Arab leaders: a small Syrian boy in a red shirt, blue shorts, and worn shoes, lying face down in wet sand, his head cocked to one side along a gray, glistening shoreline, his lifeless hands cupped upwards, his knees slightly bent.
My first reaction was despair. My second was: My son sleeps just like that.
The attention this photo has received has generated discomfort as well as indignation—for understandable reasons. There are important ethical questions surrounding the taking or sharing of photos of children, dead or alive, in the media, including questions about the intent of the sharers and the consent of the subject. The scale of the Syrian tragedy is orders of magnitude greater, and infinitely more variegated, than this one picture, or this one victim’s story, can possibly convey. Over the last four and a half years, an estimated 240,000 people have died in the grinding violence, including nearly 12,000 children. More than half of Syria’s pre-war population—half, the proportional equivalent of nearly 170 million Americans—have been forced to flee their homes, spawning the largest exodus of refugees in a generation. Seven hundred and fifty thousand Syrian children won’t be going back to school this fall.
After a lackluster summer, the famous neurosurgeon is finally surging—but his reliance on the conservative grassroots might be a burden as much as a boon.
The Ben Carson surge that everyone was waiting for is finally here.
The conservative neurosurgeon has been a source of fascination for both the Republican grassroots and the media ever since he critiqued President Obama, who was seated only a few feet away, at the National Prayer Breakfast in 2013. He’s been a steady, if middling, presence in GOP primary polls for most of the year—always earning at least 5 percent, but rarely more than 10. Yet over the last two weeks, Carson has secured a second-place spot after Donald Trump, both nationally and in the crucial opening battleground of Iowa, where he is a favorite of the state’s sizable evangelical community. A Monmouth University poll released this week even showed him tied with Trump for the lead in Iowa, at 23 percent.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure. But rather than cultivate the fine arts of indolence, he saw retirement, he said, as “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its touch-sensitive glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
By reorienting the GOP’s foreign-policy debate away from the Middle East, the flamboyant frontrunner took the pact off the front page.
Next week, Donald Trump will join Ted Cruz, Glenn Beck, and others at a rally denouncing the Iran deal. Which is ironic, because Trump is one reason the deal will pass.
Before Trump entered the campaign, foreign policy dominated the Republican presidential race. With Democrats less vulnerable on the economy, and the public growing more progressive on cultural issues like gay marriage, drugs, and crime, the GOP candidates refocused on America’s supposedly collapsing position in the world. As The New York Times reported in February, “Gruesome killings by the Islamic State, terrorist attacks in Europe and tensions with President Vladimir V. Putin of Russia are reshaping the early Republican presidential race, creating anxiety among party voters and sending potential candidates scrambling to outmuscle one another on foreign policy.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.