As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky, human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury program and the first of those flights to successfully orbit the Earth. Coming only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. One of his favorite phrases, in fact, was, "There is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet, at the same time as the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa would not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions were that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team player" achievement: an effort by builders who work in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed--especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the engineer who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are stand-ins for ourselves, for what we get to see we are capable of doing. And physical achievements--for whatever reason we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
“Don’t underestimate me,” declared newly announced presidential candidate Bernie Sanders to George Stephanopoulos on Sunday. That may be good advice.
By conventional standards, Sanders’s candidacy is absurd: He’s not well known, he doesn’t have big money donors, he’s not charismatic, and by Beltway standards, he’s ideologically extreme. But candidates with these liabilities have caught fire before. Think of Jerry Brown, who despite little funding and an oddball reputation outlasted a series of more conventional candidates to emerge as Bill Clinton’s most serious challenger in 1992. Or Pat Buchanan, who struck terror in the GOP establishment by winning the New Hampshire primary in 1996. Or Howard Dean, who began 2003 in obscurity and ended it as the Democratic frontrunner (before collapsing in the run-up to the Iowa Caucuses). Or Ron Paul, who in 2012 finished second in New Hampshire and came within three points of winning Iowa.
Two years ago, a Dutch creative agency opened a concept restaurant in Amsterdam that would be, in the words of its founder, “the perfect place to dine in pleasant solitude.” The restaurant is called Eenmaal—this name has been translated into English as “dinner for one”—and was launched in an attempt to start dissolving the stigma attached to going out alone. Apparently picking up on the same cultural drift, a new fast-casual restaurant in Washington, D.C., has tiered, bench-like seating with individual trays, an arrangement that caters to solo diners.
As antisocial as those ideas may sound, it’s surprising that the world hasn’t seen more of them. Today, more than a quarter of American households are home to just one person—a figure that has tripled since 1970. Also, the median age at which Americans get married has recently reached a record high. Given these demographic shifts, one would think that by now, going out to the movies or to dinner alone wouldn’t be the radical acts they still are.
Marilyn Mosby's press conference Friday shocked residents of Baltimore and everyone else watching protests over Freddie Gray's death. Barely 24 hours after police had completed their investigation into the death of the 25-year-old black man in police custody, the Baltimore City state's attorney announced a strong slate of charges against the six officers involved. It wasn't just the speed (Mosby said her office had begun investigations the day after Gray's arrest, and six days before his death) but the charges: second-degree depraved-heart murder against one officer, with the others facing a mix of manslaughter, assault, misconduct, and false imprisonment.
The decision was met with jubilation in West Baltimore, where protestors had rioted just four nights before. But almost immediately, critics began to second-guess Mosby, who's been on the job for just a few months. Were her charges politically motivated, or perhaps calculated to calm protests? Had she overcharged the officers, picking unfair charges, or ones she couldn't win? Did she move too fast to charge the officers?
Two recent events—the spectacle of Garry Trudeau, the Doonesbury creator, attacking a group of murdered cartoonists for offending his sensibilities, and the protest organized by a group of bien-pensant writers against the PEN American Center for planning to honor those cartoonists tonight in New York—have brought the Charlie Hebdo controversy back to public consciousness. So has the failed attack Sunday in Texas on a group of anti-Islam militants staging a Prophet Muhammad cartoon contest, though, unlike Charlie Hebdo, the organization that sponsored the Texas event is run by an actual anti-Muslim extremist who, I'm proud to say, is a personal nemesis of mine.
Much has already been written about both the Trudeau and PEN controversies. I particularly recommend David Frum on Trudeau, and Katha Pollitt and Matt Welch on PEN, as well as this fine op-ed by Andrew Solomon and Suzanne Nossel, the president and executive director, respectively, of the PEN American Center. These represent only a handful of the many dozens of writers who have risen in defense of free speech, and of Charlie Hebdo’s right to lampoon religion.
The simplest way to reduce the number of Americans who are abused by police officers is not to retrain cops or to reform their subculture. It is to significantly reduce the number of adversarial interactions people have with police.
Questions about how frequently Americans ought to interact with law enforcement are often associated with the debate over Broken Windows theory. Its proponents champion a model of policing where foot patrolmen are a regular presence in high-crime neighborhoods, vigilantly guarding against the sorts of low-level disorder that ostensibly leads to more serious crime if left unchecked.
For now, let's defer debate about Broken Windows theory.
Even if it is correct, there are still a number of reforms that would reduce adversarial contacts with police officers without increasing disorder on the streets.
If you had to place yourself in a socioeconomic class, where would you land? That’s a tricky and personal question for most Americans. Education, income, and even parental wealth can all factor into class status, but the borders of each group can still be hard to parse. That’s because socioeconomic class structure in the U.S. is a nebulous thing that can be as much about perception and comparison as it is about measurable metrics, like money.
One of the more common methods for identifying the "middle class" is simply to define it as the half of the population making more than the bottom quarter and less than the top quarter. In 2013, such a ranking would consider households with incomes between about $24,000 and $90,000 middle class, based on data from the Survey of Consumer Finances. With a more comprehensive wealth measure—taking into consideration not only income but total assets and liabilities—this middle 50 percent of Americans covers an enormous range: families who have anywhere between about $9,000 and $317,000. That is a remarkable spread, given the vastly different realities of families at either end of it.
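As a rough illustration of that percentile definition, the middle band can be computed from any list of household incomes by taking everything between the 25th and 75th percentiles. The figures below are invented for the example, not drawn from the Survey of Consumer Finances:

```python
import numpy as np

# Hypothetical household incomes in dollars (illustrative only;
# a real analysis would use survey data).
incomes = np.array([15_000, 22_000, 30_000, 45_000,
                    60_000, 75_000, 95_000, 150_000])

# "Middle class" under this definition: households between the
# 25th and 75th income percentiles.
lower, upper = np.percentile(incomes, [25, 75])
middle_class = incomes[(incomes >= lower) & (incomes <= upper)]
print(lower, upper, middle_class)
```

The same cut applied to a net-worth column instead of income yields the much wider $9,000-to-$317,000 band the article describes.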
Where did the Islamic State come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
When I was a teenager, I wished for many things. I was determined to be a historian like my intellectual idol, A.J.P. Taylor, whose television lectures on British and European history held me spellbound. I wanted to lead a political party and deliver speeches to adoring supporters. These were big dreams for a working-class kid from Glasgow whose family had never sent anyone to university.
And yet the dreams somehow came true. I went to Oxford for my doctorate and even got Alan Taylor as my supervisor, before joining the faculty of the London School of Economics. I also established a political party, UKIP, whose goal was to halt the European Union’s encroachments on British democracy and whose fortunes now constitute one of the major storylines of Britain’s general election on Thursday.
In the 2001 movie Donnie Darko, a group of teenage boys are drinking and shooting guns when the conversation turns, as you might expect, to the topic of women. “We gotta find ourselves a Smurfette,” one of them says.
“Smurfette?” his friend asks.
“Mm-hmm. Not some, like, tight-ass Middlesex chick, you know? Like, this cute little blonde that will get down and dirty with the guys. Like Smurfette does.”
“That's bullshit. Smurfette fucks all the other Smurfs.”
This exchange came to mind when watching the actor Jeremy Renner's appearance on Conan this week, during which he called Black Widow, the Avenger played by Scarlett Johansson, “a slut.” He'd already made a joke along these lines a few weeks ago, after which he apologized to anyone who'd been offended. But apparently Renner believed his joke wasn't actually vile—just misunderstood. “Conan, if you slept with four of the six Avengers, no matter how much fun you had, you’d be a slut,” he said. “I’d be a slut.”
The man from Hope is back. Nope, not that one—the one whose wife is leading the Democratic field. The one who succeeded him as governor of Arkansas: Republican Mike Huckabee.
Huckabee is announcing Tuesday that he's a candidate for president with a kickoff in the hometown he shares with Bill Clinton. After a strong run in 2008 and a decision to take the 2012 cycle off, Huckabee is testing whether he still has the same pull he once did.
He's the third Republican candidate to announce this week alone, and the fourth in 10 days. On Monday, neurosurgeon Ben Carson and tech executive Carly Fiorina both announced campaigns, and last week Senator Bernie Sanders announced he was seeking the Democratic nomination.