As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury space program and the first of those flights to successfully orbit the Earth. Coming only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. And one of his favorite phrases, in fact, was, "there is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet even as the budget for human spaceflight increases, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa will not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions was that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team personality" achievement: an effort by a team player and builder who works in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed. Especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the guy who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are stand-ins for ourselves; through them, we get to see what we are capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there, yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land and its crew was held hostage for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach with protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
A single dose of magic mushrooms can make people with severe anxiety and depression better for months, according to a landmark pair of new studies.
The doom hung like an anvil over her head. In 2012, a few years after Carol Vincent was diagnosed with non-Hodgkin lymphoma, she was waiting to see whether her cancer would progress enough to require chemotherapy or radiation. The disease had already done a number on her, inflating lymph nodes on her chin, collar bones, and groin. She battled her symptoms while running her own marketing business. To top it all off, she was going through menopause.
“Life is just pointless stress, and then you die,” she thought. “All I’m doing is sitting here waiting for all this shit to happen.”
When one day at an intersection she mulled whether it would be so bad to get hit by a car, she realized her mental health was almost as depleted as her physical state.
For nearly three decades, Fidel Castro devoted vast amounts of Cuba’s limited resources to the project of exporting his revolution to Africa, even as it stuttered at home. As leader of Cuba, Castro advocated a radical departure from the prevailing post-war liberal internationalism, premised more on the ideas of Frantz Fanon than those of Adam Smith. Decolonization seemed to offer a prime laboratory for that vision. Cuba volunteered doctors, nurses, military advisers, and troops to support what Castro and Che Guevara saw as progressive regimes—“sister countr[ies],” in Havana’s terminology—in Algeria, Eastern Congo-Kinshasa (today’s Democratic Republic of Congo), Congo-Brazzaville, Guinea-Bissau, and, later, Ethiopia.
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), but I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
This week, the U.S. president-elect spoke with the Pakistani prime minister and, according to the Pakistani government’s account of the conversation, delivered the following message: Everything is awesome. It was, arguably, the most surprising presidential phone call since George H.W. Bush got pranked by that pretend Iranian president.
Pakistan, Donald Trump reportedly told Nawaz Sharif, is a “fantastic” country full of “fantastic” people that he “would love” to visit as president. Sharif was described as “terrific.” Pakistanis “are one of the most intelligent people,” Trump allegedly added. “I am ready and willing to play any role that you want me to play to address and find solutions to the outstanding problems.”
The Daily Show host was measured, respectful, and challenging in his 26-minute conversation with TheBlaze pundit Tomi Lahren.
Tomi Lahren, the 24-year-old host of Tomi on the conservative cable network TheBlaze, feels like a pundit created by a computer algorithm, someone who primarily exists to say something provocative enough to jump to the top of a Facebook feed. She’s called the Black Lives Matter movement “the new KKK,” partly blamed the 2015 Chattanooga shootings on President Obama’s “Muslim sensitivity,” and declared Colin Kaepernick a “whiny, indulgent, attention-seeking cry-baby.” At a time when such charged political rhetoric feels increasingly like the norm, Lahren stands at one end of a widening gulf—which made her appearance on The Daily Show with Trevor Noah Wednesday night all the more fascinating.
In his first year at The Daily Show, Noah has struggled to distinguish himself in an outrage-driven late-night universe. He has sometimes seemed too flip about the failures of the country’s news media, something his predecessor Jon Stewart made a perennial target. Noah’s 26-minute conversation with Lahren, though, posted in its entirety online, set the kind of tone that Stewart frequently called for throughout his tenure. The segment never turned into a screaming match, but it also avoided platitudes and small-talk. Lahren was unapologetic about her online bombast and leaned into arguments that drew gasps and boos from Noah’s audience, but the host remained steadfastly evenhanded throughout. If Noah was looking for a specific episode that would help him break out in his crowded field, he may have finally found it.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Editor’s note: An earlier version of this story presented an economic modeling assumption—the .01 chance of human extinction per year—as a vetted scholarly estimate. Following a correction from the Global Priorities Project, the text below has been updated.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. A new report from the U.K.-based Global Challenges Foundation urges us to take them seriously.
The nonprofit began its annual report on “global catastrophic risk” with a startling provocation: If figures often used to compute human extinction risk are correct, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
“Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial “brain.” “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”
I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
Critics say she failed to energize the Democratic base. But vote totals show her biggest shortcomings were in counties that opposed Barack Obama the most.
It now seems likely that Hillary Clinton will get fewer votes than Barack Obama did in 2012. More distressingly for Democrats, she fared worse in Democratic-leaning cities that anchor swing states, including Detroit, Cleveland, and Milwaukee. To critics on the left, that’s evidence of a campaign that dragged its feet, and a candidate who took her base for granted. Her defeat, in their minds, was an unforced error.
But the numbers show something different. There’s no question Clinton faltered in some Democratic cities, but the gaps between her haul and Obama’s in those locations were modest. The vast majority of her deficit came instead from counties that Obama lost in 2012: They didn’t like him, but they really hated her.