As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury space program and the first of those flights to successfully orbit the Earth. Coming as it does only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. And one of his favorite phrases, in fact, was, "there is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet, at the same time as the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa will not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions were that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team personality" achievement: an effort by a team player and builder who works in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed. Especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the engineer who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are stand-ins for ourselves, for what we get to see we are capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there, yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
Demonizing processed food may be dooming many to obesity and disease. Could embracing the drive-thru make us all healthier?
Late last year, in a small health-food eatery called Cafe Sprouts in Oberlin, Ohio, I had what may well have been the most wholesome beverage of my life. The friendly server patiently guided me to an apple-blueberry-kale-carrot smoothie-juice combination, which she spent the next several minutes preparing, mostly by shepherding farm-fresh produce into machinery. The result was tasty, but at 300 calories (by my rough calculation) in a 16-ounce cup, it was more than my diet could regularly absorb without consequences, nor was I about to make a habit of $9 shakes, healthy or not.
Inspired by the experience nonetheless, I tried again two months later at L.A.’s Real Food Daily, a popular vegan restaurant near Hollywood. I was initially wary of a low-calorie juice made almost entirely from green vegetables, but the server assured me it was a popular treat. I like to brag that I can eat anything, and I scarf down all sorts of raw vegetables like candy, but I could stomach only about a third of this oddly foamy, bitter concoction. It smelled like lawn clippings and tasted like liquid celery. It goes for $7.95, and I waited 10 minutes for it.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
ISIS did not merely blast apart old stones—it attacked the very foundations of pluralistic society.
If the ruins of Palmyra could speak, they would marvel at our shock. After all, they have been sacked before. In their mute and shattered eloquence, they spoke for centuries not only about the cultures that built them but also about the cultures that destroyed them—about the fragility of civilization itself, even when it is incarnated in stone. No designation of sanctity, by God or by UNESCO, suffices to protect the past. The past is helpless. Instead these ruins, all ruins, have had the effect of lifting the past out of history and into time. They carry the spectator away from facts and toward reveries.
In the 18th century, after the publication in London of The Ruins of Palmyra, a pioneering volume of etchings by Robert Wood, who had traveled to the Syrian desert with the rather colorful James Dawkins, a fellow antiquarian and politician, the desolation of Palmyra became a recurring symbol for ephemerality and the vanity of all human endeavors. “It is the natural and common fate of cities,” Wood drily remarked in one of the essays in his book, “to have their memory longer preserved than their ruins.” Wood’s beautiful and meticulous prints served as inspirations for paintings, and it was in response to one of those paintings that Diderot wrote some famous pages in his great Salons of 1767: “The ideas ruins evoke in me are grand. Everything comes to nothing, everything perishes, everything passes, only the world remains, only time endures. ... Wherever I cast my glance, the objects surrounding me announce death and compel my resignation to what awaits me. What is my ephemeral existence in comparison with that of a rock being worn down, of a valley being formed, of a forest that’s dying, of these deteriorating masses suspended above my head? I see the marble of tombs crumble into powder and I don’t want to die!”
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML code and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ’aughts, came to be known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Encouraging a focus on white identity is a dangerous approach for a country in which white supremacy has been a toxic force.
Donald Trump and the disaffected white people who make up his base of support have got me thinking about race in America. “Trump presents a choice for the Republican Party about which path to follow––” Ben Domenech writes in an insightful piece at The Federalist, “a path toward a coalition that is broad, classically liberal, and consistent with the party’s history, or a path toward a coalition that is reduced to the narrow interests of identity politics for white people.”
When I was growing up in Republican Orange County during the Reagan and Bush Administrations, lots of white parents sat their kids in front of The Cosby Show, explained that black people are just like white people, and inveighed against judging anyone by the color of their skin rather than the content of their character. The approach didn’t convey the full reality of race as minorities experience it. But it represented a significant generational improvement in race relations.
Some Republican candidates are promoting a policy change that would hurt workers by disguising it with a pleasant-sounding phrase.
Americans like their Social Security benefits quite a bit: They oppose cuts to them by a margin of two to one. Even Millennials, who won’t be seeing benefits anytime soon, feel protective of Social Security, according to a poll from the Pew Research Center.
One way to effectively cut Social Security benefits is to raise the age at which they kick in. And yet, when asked specifically about raising the retirement age, Americans' opinions are mixed.
Perhaps the confusion arises because “raising the age of retirement” sounds like a nice jobs program for older Americans, or an end to forced retirement. I sympathize with that sentiment: anyone who wants to retire later and work into old age should be able to. But that’s not what raising the retirement age would entail. The fact is, raising the Social Security retirement age represents a reduction in benefits: because the monthly payments a person receives grow larger the later in life he or she retires, raising the age cutoff reduces the total amount of money paid out over a lifetime.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.