As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury space program and the first of those flights to successfully orbit the Earth. Coming as it does only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. One of his favorite phrases, in fact, was, "There is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, the first to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet, at the same time as the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa will not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions was that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team personality" achievement: an effort by a team player and builder who works in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed. Especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the guy who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are stand-ins for ourselves; for what we get to see we are capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there, yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
Some fans are complaining that Zack Snyder’s envisioning of the Man of Steel is too grim—but it’s less a departure than a return to the superhero’s roots.
Since the official teaser trailer for Batman v Superman: Dawn of Justice debuted online in April, fans and critics alike have been discussing the kind of Superman Zack Snyder is going to depict in his Man of Steel sequel. The controversy stems from Snyder’s decision to cast Superman as a brooding, Dark Knight-like character who cares more about beating up bad guys than saving people. The split has proved divisive among Superman fans: Some love the new incarnation, hailing him as an edgier, more realistic version of the character.
But Snyder’s is a different Superman than the one fans grew up with, and many have no problem expressing their outrage over it. Even Mark Waid, the author of Superman: Birthright (one of the comics the original film is based on), voiced his concern about Man of Steel’s turn toward bleakness when it came out in 2013:
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
In an interview, the U.S. president ties his legacy to a pact with Tehran, argues ISIS is not winning, warns Saudi Arabia not to pursue a nuclear-weapons program, and anguishes about Israel.
On Tuesday afternoon, as President Obama was bringing an occasionally contentious but often illuminating hour-long conversation about the Middle East to an end, I brought up a persistent worry. “A majority of American Jews want to support the Iran deal,” I said, “but a lot of people are anxiety-ridden about this, as am I.” Like many Jews—and also, by the way, many non-Jews—I believe that it is prudent to keep nuclear weapons out of the hands of anti-Semitic regimes. Obama, who earlier in the discussion had explicitly labeled the supreme leader of Iran, Ayatollah Ali Khamenei, an anti-Semite, responded with an argument I had not heard him make before.
“Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said, referring to the apparently almost-finished nuclear agreement between Iran and a group of world powers led by the United States. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
The brilliant mathematician, who died in a car accident on Saturday, was best known for his struggle with mental illness.
John Nash, a Nobel laureate and mathematical genius whose struggle with mental illness was documented in the Oscar-winning film A Beautiful Mind, was killed in a car accident on Saturday. He was 86. The accident, which occurred when the taxi Nash was traveling in collided with another car on the New Jersey Turnpike, also claimed the life of his 82-year-old wife, Alicia. Neither of the two drivers involved in the accident sustained life-threatening injuries.
Born in West Virginia in 1928, Nash displayed an acuity for mathematics early in life, independently proving Fermat’s little theorem before graduating from high school. By the time he turned 30 in 1958, he was a bona fide academic celebrity. At Princeton, Nash published a 27-page thesis that upended the field of game theory and led to applications in economics, international politics, and evolutionary biology. His signature solution—known as a “Nash Equilibrium”—found that competition between two opponents is not necessarily governed by zero-sum logic. Two opponents can, for instance, each achieve their maximum objectives by cooperating with each other, or gain nothing at all by refusing to cooperate. This intuitive, deceptively simple understanding is now regarded as one of the most important social-science ideas of the 20th century, and a testament to his almost singular intellectual gifts.
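Nash's criterion can be made concrete with a small worked example. The sketch below uses a hypothetical two-strategy game (the payoff numbers are illustrative, not drawn from Nash's thesis) and brute-forces its equilibria: a pair of strategies is a Nash Equilibrium when neither player can do better by unilaterally switching.

```python
from itertools import product

# Hypothetical two-player game: each cell holds
# (row player's payoff, column player's payoff).
# Mutual cooperation pays both players; mutual refusal pays nothing,
# so the game is not zero-sum.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "refuse"): (0, 1),
    ("refuse", "cooperate"): (1, 0),
    ("refuse", "refuse"): (0, 0),
}
strategies = ["cooperate", "refuse"]

def is_nash_equilibrium(row, col):
    """True if neither player gains by unilaterally switching strategy."""
    r_payoff, c_payoff = payoffs[(row, col)]
    best_row = max(payoffs[(r, col)][0] for r in strategies)
    best_col = max(payoffs[(row, c)][1] for c in strategies)
    return r_payoff == best_row and c_payoff == best_col

equilibria = [cell for cell in product(strategies, strategies)
              if is_nash_equilibrium(*cell)]
print(equilibria)  # both mutual cooperation and mutual refusal qualify
```

In this toy game both (cooperate, cooperate) and (refuse, refuse) are equilibria, mirroring the article's point: cooperation can deliver each player's best outcome, while mutual refusal, though stable, yields nothing.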
Advocates say that a guaranteed basic income can lead to more creative, fulfilling work. The question is how to fund it.
Scott Santens has been thinking a lot about fish lately. Specifically, he’s been reflecting on the aphorism, “If you give a man a fish, he eats for a day. If you teach a man to fish, he eats for life.” What Santens wants to know is this: “If you build a robot to fish, do all men starve, or do all men eat?”
Santens is 37 years old, and he’s a leader in the basic income movement—a worldwide network of thousands of advocates (26,000 on Reddit alone) who believe that governments should provide every citizen with a monthly stipend big enough to cover life’s basic necessities. The idea of a basic income has been around for decades, and it once drew support from leaders as different as Martin Luther King Jr. and Richard Nixon. But rather than waiting for governments to act, Santens has started crowdfunding his own basic income of $1,000 per month. He’s nearly halfway to his goal.
19 Kids and Counting built its reputation on preaching family values, but the mass-media platforms that made the family famous might also be its undoing.
On Thursday, news broke that Josh Duggar, the oldest son of the Duggar family's 19 children, had, as a teenager, allegedly molested five underage girls. Four of them, allegedly, were his sisters.
The information came to light because, in 2006—two years before 17 Kids and Counting first aired on TLC, and thus two years before the Duggars became reality-TV celebrities—the family recorded an appearance on The Oprah Winfrey Show. Before the taping, an anonymous source sent an email to Harpo warning the production company of Josh’s alleged molestation. Harpo forwarded the email to authorities, triggering a police investigation (the Oprah appearance never aired). The news was reported this week by In Touch Weekly—after the magazine filed a Freedom of Information Act request to see the police report on the case—and then confirmed by the Duggars in a statement posted on Facebook.
Why agriculture may someday take place in towers, not fields
A couple of Octobers ago, I found myself standing on a 5,000-acre cotton farm on the outskirts of Lubbock, Texas, shoulder-to-shoulder with a third-generation cotton farmer. He swept his arm across the flat, brown horizon of his field, which was at that moment being harvested by an industrial-sized picker—a toothy machine as tall as a house and operated by one man. The picker’s yields were being dropped into a giant pod to be delivered late that night to the local gin. And far beneath our feet, the Ogallala aquifer dwindled away at its frighteningly swift pace. When asked about this, the farmer spoke of reverse osmosis—the process of desalinating water—in which he seemed to put his faith, and which kept him unafraid of famine and permanent drought.
In any case, most people have probably heard the phrase "the shit hits the fan" in reference to something gone awry at work or in life. In either setting, when the shit does hit the fan, people tend to look to the most competent person in the room to take over.
And too bad for that person. A new paper by a team of researchers from Duke University, the University of Georgia, and the University of Colorado looks not only at how extremely competent people are treated by their co-workers and peers, but also at how those people feel when, at crucial moments, everyone turns to them. The researchers find that responsible employees are not terribly pleased about this dynamic either.