As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury program and the first of those flights to successfully orbit the Earth. Coming as it does only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. One of his favorite phrases, in fact, was "there is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet, at the same time as the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa would not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions were that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team personality" achievement: an effort by a team player and builder who works in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed. Especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the engineer who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are the stand-ins for ourselves, showing us what we are capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there, yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
The paper of record’s inaccurate reporting on a nonexistent criminal investigation was a failure that should entail more serious consequences.
I have read The New York Times since I was a teenager as the newspaper to be trusted, the paper of record, the definitive account. But the huge embarrassment over the story claiming a criminal investigation of Hillary Clinton for her emails—leading the webpage, prominent on the front page, before being corrected in the usual, cringeworthy fashion of journalists who stonewall any alleged errors and then downplay the real ones—is a direct challenge to its fundamental credibility. And the paper’s response since the initial huge error was uncovered has not been adequate or acceptable.
This is not some minor mistake. Stories, once published, take on a life of their own. If they reinforce existing views or stereotypes, they fit perfectly into Mark Twain’s observation, “A lie can travel halfway around the world while the truth is putting on its shoes.” (Or perhaps Twain never said it, in which case the ubiquity of that attribution serves to validate the point.) And a distorted and inaccurate story about a prominent political figure running for president is especially damaging and unconscionable.
A newly discovered artifact buried with one of Jamestown’s most prominent leaders suggests he could have been a crypto-Catholic.
After 400 years in the Virginia dirt, the box came out of the ground looking like it had been plucked from the ocean. A tiny silver brick, now encrusted with a green patina and rough as sandpaper. Buried beneath it was a human skeleton. The remains would later be identified as those of Captain Gabriel Archer, one of the most prominent leaders at Jamestown, the first permanent English colony in America. But it was the box, which appeared to be an ancient Catholic reliquary, that had archaeologists bewildered and astonished.
“One of the major surprises was the discovery of this mysterious small silver box,” said James Horn, the president of the Jamestown Rediscovery Foundation. “I have to say, we’re still trying to figure this out. You have the very strange situation of a Catholic reliquary being found with the leader of the first Protestant church in the country.”
Has the Obama administration’s pursuit of new beginnings blinded it to enduring enmities?
“The president said many times he’s willing to step out of the rut of history.” In this way Ben Rhodes of the White House, who over the years has broken new ground in the grandiosity of presidential apologetics, described the courage of Barack Obama in concluding the Joint Comprehensive Plan of Action with the Islamic Republic of Iran, otherwise known as the Iran deal. Once again Rhodes has, perhaps inadvertently, exposed the president’s premises more clearly than the president likes to do. The rut of history: It is a phrase worth pondering. It expresses a deep scorn for the past, a zeal for newness and rupture, an arrogance about old struggles and old accomplishments, a hastiness with inherited precedents and circumstances, a superstition about the magical powers of the present. It expresses also a generational view of history, which, like the view of history in terms of decades and centuries, is one of the shallowest views of all.
The new version of Apple’s signature media software is a mess. What are people with large MP3 libraries to do?
When the developer Erik Kemp designed the first metadata system for MP3s in 1996, he provided only three options for attaching text to the music. Every audio file could be labeled with only an artist, song name, and album title.
Kemp’s system has since been augmented and improved upon, but never replaced. Which makes sense: like the web itself, his schema was shipped, good enough, and an improvement on the vacuum that preceded it. Those three big tags, as they’re called, work well for pop and rock written between 1960 and 1995--though, as anyone who remembers Napster can tell you, that didn’t prevent rampant mislabeling in the early days of the web. The system stumbles even more when it needs to capture hip-hop’s tradition of guest MCs or jazz’s vibrant culture of studio musicianship.
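To make the constraint concrete: Kemp's scheme (known as ID3v1) lives in the last 128 bytes of an MP3 file, with each label crammed into a fixed-width, null-padded field. The sketch below is illustrative, not a full tag reader (the function name is my own, and it ignores the later fields the format also carries), but it shows why there is simply nowhere to put a guest MC or a session drummer:

```python
def parse_id3v1(data: bytes):
    """Read the three big tags from an ID3v1 block.

    An ID3v1 tag occupies the final 128 bytes of an MP3 file:
    the marker "TAG" followed by fixed 30-byte text fields.
    Returns None if no tag is present.
    """
    if len(data) < 128:
        return None
    tag = data[-128:]
    if not tag.startswith(b"TAG"):
        return None

    def field(start: int, length: int) -> str:
        # Fields are null-padded; keep the text before the first null byte.
        raw = tag[start:start + length].split(b"\x00")[0]
        return raw.decode("latin-1").strip()

    return {
        "title":  field(3, 30),   # song name
        "artist": field(33, 30),  # one artist, 30 characters, no more
        "album":  field(63, 30),
    }
```

Everything beyond those fixed slots, from featured performers to movement numbers, has to be wedged into one of the three strings or dropped.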
The agreement doesn’t guarantee that Tehran will never produce nuclear weapons—because no agreement could do so.
A week ago I volunteered my way into an Atlantic debate on the merits of the Iran nuclear agreement. The long version of the post is here; the summary is that the administration has both specific facts and longer-term historic patterns on its side in recommending the deal.
On the factual front, I argued that opponents had not then (and have not now) met President Obama’s challenge to propose a better real-world alternative to the negotiated terms. Better means one that would make it less attractive for Iran to pursue a bomb, over a longer period of time. Real world means not the standard “Obama should have been tougher” carping but a specific demand that the other countries on “our” side, notably including Russia and China, would have joined in insisting on, and that the Iranians would have accepted.
Orr: “Sometimes a thing happens. Splits your life. There’s a before and after. I got like five of them at this point.”
This was Frank offering a pep talk to the son of his murdered former henchman Stan in tonight’s episode. (More on this in a moment.) But it’s also a line that captures this season of True Detective so perfectly that it almost seems like a form of subliminal self-critique.
Remember when Ray got shot in episode two and appeared to be dead, but came back with a renewed sense of purpose and stopped drinking? No? That’s okay. Neither does the show: It was essentially forgotten after the subsequent episode. Remember when half a dozen (or more) Vinci cops were killed in a bloody shootout along with dozen(s?) of civilians? No? Fine: True Detective’s left that behind, too. Unless I missed it, there was not a single mention of this nationally historic bloodbath tonight.
This is the third in a series. Readers are invited to send their own responses to firstname.lastname@example.org, and we will post their strongest critiques of the book and the accompanying reviews. (The first batch is here.) To further encourage civil and substantive responses via email, we are closing the comments section. You can follow the whole series on Twitter at #BTWAM and read all of the responses to the book from Atlantic readers and contributors.
Several years ago, Ta-Nehisi Coates took his son, not yet 5, to see a movie on the Upper West Side of Manhattan. As his son made his way off the escalator, a white woman pushed him and said, “Come on!” Chaos ensued. There was a black parent’s rage and a white man’s threat to have the black parent arrested. Coates narrates the incident in cool, steady prose. Ultimately, he writes of the regret he carries: “In seeking to defend you I was, in fact, endangering you.”
Readers discuss Ta-Nehisi Coates's bestseller. Is it too bleak? Does it convey any hope for race relations? Is that even the point?
Thus far, The Atlantic has posted three essays on Between the World and Me, from Michael Eric Dyson, James Forman Jr., and Tressie McMillan Cottom, all of them uncritical. Among the reader responses so far, the strongest critique comes from Melvin Rogers, a professor of African American Studies and Political Science at U.C.L.A. Rogers emailed an eloquent seven-page review, but below is a shorter edited version, posted with permission:
Between The World and Me is an exquisite book, overflowing with insights about the embodied state of blackness and the logic of white supremacy. Coates’s prose is capable of challenging our understanding of the United States even as it captures our hearts. I plan to teach the book for two of my courses this academic year.
But for all of the beauty and power of the book, it is also profoundly troubling. The wound of racism is too fresh; the sharpness of the pain captures Coates’s senses and arrests his imagination. The worry is that if we follow along, we, too, shall be captured.
The book initially seems like it will reveal the illusion of the Dream and then open up the possibility for imagining the United States anew. But Coates does not move in that direction. He rejects the American mythos but also embraces the certainty of white supremacy and its inescapable constraints. For him, white supremacy is not merely a historically emergent feature of the United States; it is an ontology. White supremacy, in other words, does not structure reality; it is reality.
There’s a danger there. When one conceptualizes white supremacy at the level of ontology, there is little room for one’s imagination to soar, and one’s sense of agency is inescapably constrained. Action is tied fundamentally to what we imagine is possible for us, but there can be no affirmative politics when race functions as a wounded attachment.
What about all those young men and women in the streets of Ferguson, Chicago, New York, and Charleston—how should we read their efforts? Coates’s answer seems to appear in one of the pivotal and tragic moments of the book—the murder of a college friend, Prince Jones, at the hands of the police:
[N]o one would be brought to account for this destruction... The earthquake cannot be subpoenaed. The typhoon will not bend under indictment. They sent the killer of Prince Jones back to his work, because he was not a killer at all. He was a force of nature, the helpless agent of our world’s physical laws.
But if we are all just helpless agents of physical laws, the question again emerges: What does one do? Coates recommends interrogation and struggle. His love for books and his journey to Howard University—“Mecca,” as he calls it—serve to question the world around him. But interrogation and struggle to what end?
“It is truly horrible,” Coates writes in one of the most disturbing sentences of the book, “to understand yourself as the essential below of your country.” Herein lies the danger: Forget telling his son it will be okay; Coates cannot even tell him it may be okay. “The struggle is really all I have for you,” he tells his son, “because it is the only portion of this world under your control.” What a strange form of control. Black folks may control their place in the battle, but never with the possibility that they, and in turn their country, may win.
Releasing the book at this moment—given all that is going on with black lives under public assault—seems the oddest thing to do. For all of the channeling of James Baldwin, Coates seems to have forgotten that black folks “can’t afford despair”:
The reason why you can’t say there isn’t hope is not because you are living in a dream or selling a fantasy, but because there can be no certain knowledge of the future. Humility, borne of our ignorance of the future, justifies hope.
Much has been made of the comparison between Baldwin and Coates, owing to how the book is structured and because of Toni Morrison’s endorsement. But what this connection means escapes many commentators. In Notes of a Native Son, Baldwin reflects on the wounds that white supremacy left on his father:
When he died, I had been away from home for a little over a year. In that year I had had time to become aware of the meaning of all my father's bitter warnings, had discovered the secret of his proudly pursed lips and rigid carriage: I had discovered the weight of white people in the world. I saw that this had been for my ancestors and now would be for me an awful thing to live with and that the bitterness which had helped to kill my father could also kill me.
Similar to Coates, Baldwin’s father was wounded and so was Baldwin. Yet Baldwin knew that wounded attachment would destroy not the plunderers of black life but the ones who were plundered. “Hatred, which could destroy so much, never failed to destroy the man who hated and this was an immutable law.” Baldwin’s father, as he understood him, was destroyed by hatred.
So Coates is less like Baldwin in this respect and, perhaps, more like Baldwin’s father. “I am wounded,” writes Coates. “I am marked by old codes, which shielded me in one world and then chained me in the next.” The chains reach out to imprison not only his son, but you and me as well.
Lastly, given the power of the book and its blockbuster success, Coates seems unable to linger on the conditions that gave life to the Ta-Nehisi Coates who now occupies the public stage. His own engagement with the world—his very agency—received social support. Throughout his book he recounts the rich diversity of black beauty and empowerment, especially at Howard. His father, William Paul Coates, is the founder of Black Classic Press, which focuses on the richness of black life. His mother, Cheryl Waters, financially supported the family and provided young Coates with direction, especially with his writing at an early age. And yet the adult Coates seems to stand at a distance from the condition of possibility suggested by those examples.
Black life in America is at once informed by, but not reducible to, the pain exacted on our bodies by this country. This eludes Coates. The wound is so intense he cannot direct his senses beyond the pain.
A new lawsuit filed against Conan O’Brien underlines the murky territory that comes with accusing another comedian of plagiarism.
On July 25, a Twitter user noticed something strange. The site has long suffered from joke theft—usually involving spam accounts that copy and paste other users’ popular tweets in a cheap bid for attention. But it seemed like, for the first time, one comedian’s stolen jokes were being quietly deleted and replaced by a statement from Twitter referring to “a report from the copyright holder.” After years of flagrant theft, the new policy was heartening. But that the network took so long to acknowledge that a verbatim retelling of a joke by another user counted as copyright infringement points to the issue’s larger nebulousness, which has plagued comedy for decades. If the wording of two jokes isn’t exactly alike, how can you tell one was stolen?