As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third flight in NASA's Mercury program and the first of them to successfully orbit the Earth. Coming only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to reach space and perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini, and Apollo. One of his favorite phrases, in fact, was, "There is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of the knowledge they produce--which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall funding, but only by a small amount. The noticeable shift is that it reduces funding for planetary science missions by 20 percent while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, carried out remotely with spacecraft and robots, are far less costly. Yet at the same time that the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa would not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding missions were that they offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to sending highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science-fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star-champion model of hero. Designing a robot to explore Mars is a "team player" kind of achievement: an effort by builders who work in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed--especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes the play possible. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in orbit, rather than the guy who designed the fix in the first place.
In the case of robotic or satellite missions, the human achievement is primarily mental, and it takes place on the ground, in a lab, with lots of career and project risk but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic gold medal, or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are stand-ins for ourselves, for what we get to see we're capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of them contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life-support systems to sustain a human crew all the way to Mars and back, for example, we will advance technology to the point where we can figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument for pushing the boundaries robotically first--both to develop the physics, propulsion, and materials technology to make deep-space travel possible at a much more reasonable cost, and to explore which places or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states, so shelving it for the foreseeable future would have serious political and economic ramifications that no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.