Like many people (including my Atlantic colleague Jim Fallows), when I first heard about Northwest Airlines Flight 188 going radio silent for 75 minutes and overshooting its destination by 150 miles in October, I figured the pilots must have fallen asleep. As a pilot myself, I could think of no other conceivable reason for such a jaw-dropping lapse in pilot performance. And like Fallows, I, too, reacted with disbelief when the pilots said they had simply been too absorbed with company scheduling issues on their laptop computers. For over an hour?? With absolutely zero thought to where the heck are we??
In subsequent conversations with airline pilots I know, I discovered that to those who fly the line, it's not inconceivable. Just appalling. Suffice it to say that there are apparently a few other pilots out there whose sense of professionalism is noticeably and irritatingly lacking. And while pilots usually try to cut each other a little slack, especially when the criticism comes from outside the industry, I've received only one email from an airline pilot defending the actions of the Northwest crew. The rest ran along the lines of "they should be stripped of their ratings and pensions and never be allowed to fly an airplane again. Period."
The FAA agreed, revoking the pilots' certificates within days of the event. The reason all this is noteworthy again is that the pilots are currently appealing those revocations. And in statements to the FAA released Monday, they tried to shift the blame onto the air traffic controllers who failed to get in touch with them, saying that failures by the controllers were "a causal or contributing factor in the incident."
I'm not sure which is more outrageous, actually. To get so engrossed in your personal priorities that you don't bother to ask, "gee, why is Center not calling us," or glance at any of the navigation screens that show you fast approaching your destination, or notice any of the eight separate text messages your own dispatchers have sent you, accompanied by warning lights ... in short, to not think for even one minute about actually flying the airplane ... or to try to blame it on controllers who didn't manage to yell at you loudly enough to get your distracted attention.
It is drilled into every pilot, from the earliest days of their flight training, that the pilot in command is just that: the person who holds final responsibility and accountability for the safe outcome of every flight. If you're flying in busy airspace, in clouds, or at altitudes where the airlines cruise, there are rules that say you have to be in contact with controllers and on a flight plan, at all times. If a controller says you need to do something, in most cases, you should do it. But the pilot retains final responsibility and say over the operation of the aircraft--as it should be. After all, as pilots are fond of saying, the furthest a controller can fall is the 18 inches from their chair to the floor.
If a pilot doesn't feel they can safely execute a controller's request, the simple response "unable" trumps the controller's direction. If worse comes to worst, a pilot can simply declare an emergency and do whatever is necessary to save the airplane and sort the details out on the ground. So blaming the controllers for not doing a better job at getting you to do your job is an even flimsier excuse than saying "the dog ate my homework" or "Johnny made me do it."
Controllers can make mistakes, of course, and from reading the transcripts of the air traffic control communications related to that flight, it seems as if there might have been room for improvement. Not in getting the attention of the Northwest pilots (one controller tried to contact the pilots more than a dozen times), but in realizing that a potentially serious situation, with potentially serious security concerns, was unfolding before them.
In the years since the attacks of 9/11, any number of small-airplane pilots who strayed out of approved flight paths or airspace have found themselves eye to eye with pilots in military aircraft and helicopters, signaling stern orders to follow them to an airport and land NOW. This, mind you, for little training aircraft that weigh less than a Honda Civic and could probably do less damage. Yet an airliner with the fuel and mass to really do damage goes radio silent for over an hour, cruises right past its destination, and nobody moves to intercept it--at least in part because controllers were slow to process what was going on and notify the appropriate agencies.
A mismanaged shift change in Denver may account for some of the delay. And in all fairness, the controllers who handled the flight after that assumed a benign explanation: that the flight had simply lost its radios and was unable to talk to anyone. So they treated it as such. And that kind of thing does happen. But the transcripts also show confusion among controllers about what was really going on and what to do about it. The same kind of confusion that the transcripts of controllers on 9/11 showed. Where are they? Are you talking to them? Can you get someone to try to reach them? Did someone call their company dispatchers?
Of course, the airliner had not departed from its flight path, or shown erratic behavior that would have raised more alarm. And enough little glitches happen in air traffic control communications that controllers are not trigger-loaded to ring alarm bells at the first sign of something amiss. But, still. The Commander of the North American Aerospace Defense Command is not pleased.
It's been said that the attacks of 9/11 succeeded due to a lack of imagination on our part. We simply couldn't conceive of hijackers using box cutters to take over airplanes and fly them into buildings. And perhaps the controllers working with Flight 188 had, thankfully, gotten so used to safe skies again that they assumed a lack of contact from an airliner meant an inoperable radio rather than imagining something more serious.
So I hope controllers are getting a refresher course on the importance of better coordination, keeping alert for anomalies, and questioning all the possible reasons a problem might be occurring. But for the Northwest crew to blame the controllers for not preventing their own transgressions is, as Jim Fallows said of the transgressions themselves, beyond the pale.
I'm guessing that the pilots are following the advice of their lawyers, who are trying to find any and all angles out of a thin list of possibilities that might get their clients off the hook. But ever since the first officer confidently told the press that the passengers were in no danger at any time, the crew has shown an appalling lack of awareness of just how egregious their sins were. What if the flight had been intercepted, as it perhaps should have been? Not to mention multiple other hazards that come with having a flight crew so detached from what's going on in the cockpit.
Perhaps it's asking too much to expect pilots who thought so little of their professional responsibilities in the first place to step up and take professional, mature responsibility for their failures. And the idea of minimizing their professional and legal exposure and cost is surely a tempting one. But redemption doesn't come as easily as a legal victory. And it surely doesn't come from blaming someone else for your own mistakes.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
ISIS did not merely blast apart old stones—it attacked the very foundations of pluralistic society.
If the ruins of Palmyra could speak, they would marvel at our shock. After all, they have been sacked before. In their mute and shattered eloquence, they spoke for centuries not only about the cultures that built them but also about the cultures that destroyed them—about the fragility of civilization itself, even when it is incarnated in stone. No designation of sanctity, by God or by UNESCO, suffices to protect the past. The past is helpless. Instead these ruins, all ruins, have had the effect of lifting the past out of history and into time. They carry the spectator away from facts and toward reveries.
In the 18th century, after the publication in London of The Ruins of Palmyra, a pioneering volume of etchings by Robert Wood, who had traveled to the Syrian desert with the rather colorful James Dawkins, a fellow antiquarian and politician, the desolation of Palmyra became a recurring symbol for ephemerality and the vanity of all human endeavors. “It is the natural and common fate of cities,” Wood dryly remarked in one of the essays in his book, “to have their memory longer preserved than their ruins.” Wood’s beautiful and meticulous prints served as inspirations for paintings, and it was in response to one of those paintings that Diderot wrote some famous pages in his great Salons of 1767: “The ideas ruins evoke in me are grand. Everything comes to nothing, everything perishes, everything passes, only the world remains, only time endures. ... Wherever I cast my glance, the objects surrounding me announce death and compel my resignation to what awaits me. What is my ephemeral existence in comparison with that of a rock being worn down, of a valley being formed, of a forest that’s dying, of these deteriorating masses suspended above my head? I see the marble of tombs crumble into powder and I don’t want to die!”
Demonizing processed food may be dooming many to obesity and disease. Could embracing the drive-thru make us all healthier?
Late last year, in a small health-food eatery called Cafe Sprouts in Oberlin, Ohio, I had what may well have been the most wholesome beverage of my life. The friendly server patiently guided me to an apple-blueberry-kale-carrot smoothie-juice combination, which she spent the next several minutes preparing, mostly by shepherding farm-fresh produce into machinery. The result was tasty, but at 300 calories (by my rough calculation) in a 16-ounce cup, it was more than my diet could regularly absorb without consequences, nor was I about to make a habit of $9 shakes, healthy or not.
Inspired by the experience nonetheless, I tried again two months later at L.A.’s Real Food Daily, a popular vegan restaurant near Hollywood. I was initially wary of a low-calorie juice made almost entirely from green vegetables, but the server assured me it was a popular treat. I like to brag that I can eat anything, and I scarf down all sorts of raw vegetables like candy, but I could stomach only about a third of this oddly foamy, bitter concoction. It smelled like lawn clippings and tasted like liquid celery. It goes for $7.95, and I waited 10 minutes for it.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015, marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ’aughts, came to be known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
Encouraging a focus on white identity is a dangerous approach for a country in which white supremacy has been a toxic force.
Donald Trump and the disaffected white people who make up his base of support have got me thinking about race in America. “Trump presents a choice for the Republican Party about which path to follow—” Ben Domenech writes in an insightful piece at The Federalist, “a path toward a coalition that is broad, classically liberal, and consistent with the party’s history, or a path toward a coalition that is reduced to the narrow interests of identity politics for white people.”
When I was growing up in Republican Orange County during the Reagan and Bush Administrations, lots of white parents sat their kids in front of The Cosby Show, explained that black people are just like white people, and inveighed against judging anyone by the color of their skin rather than the content of their character. The approach didn’t convey the full reality of race as minorities experience it. But it represented a significant generational improvement in race relations.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Every time you shrug, you don’t need to Google, then copy, then paste.
All hail ¯\_(ツ)_/¯.
In its 11 strokes, the symbol encapsulates what it’s like to be an individual on the Internet. With raised arms and a half-turned smile, it exudes the melancholia, the malaise, the acceptance, and (finally) the embrace of knowing that something’s wrong on the Internet and you can’t do anything about it.
As Kyle Chayka writes in a new history of the symbol at The Awl, the meaning of “the shruggie” is always two-, if not three- or four-fold. ¯\_(ツ)_/¯ represents nihilism, “bemused resignation,” and “a Zen-like tool to accept the chaos of the universe.” It is Sisyphus in Unicode. I use it at least 10 times a day.
For a long time, however, I used it with some difficulty. Unlike better-known emoticons like :) or ;), ¯\_(ツ)_/¯ borrows characters from the Japanese syllabary called katakana. That makes it a kaomoji, a Japanese emoticon; it also makes it, on Western alphabetical keyboards at least, very hard to type. But then I found a solution, and it saves me having to google “smiley sideways shrug” every time I want to quickly rail at the world’s inherent lack of meaning.
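The fix lends itself to a few lines of code. Here’s a minimal sketch of one such workaround (my own illustration, not necessarily the solution described here): a tiny Python script that drops the shruggie onto the system clipboard, assuming the third-party pyperclip package is installed.

```python
# A minimal sketch of one workaround (not necessarily the article's
# own fix): put the shruggie on the system clipboard so it can be
# pasted anywhere. Assumes the third-party pyperclip package is
# installed (pip install pyperclip).
import pyperclip

# The backslash has to be escaped inside a Python string literal.
SHRUGGIE = "¯\\_(ツ)_/¯"

pyperclip.copy(SHRUGGIE)
print("Copied to clipboard:", SHRUGGIE)
```

Bound to a keyboard shortcut or a shell alias, something like this turns all those hard-to-find strokes into one.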