JFK challenged Americans to take to the skies half a century ago -- but as human space flight embraced rockets rather than reusable spacecraft, what did we lose?
Fifty years ago, on May 25, 1961, President John F. Kennedy stood before Congress and laid out his famous challenge for the nation to "commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to earth."
It was a lofty goal that set in motion the intense technology development of the Apollo era, and a moment we remember happily because, after all, we succeeded! Against all odds, Neil Armstrong and Buzz Aldrin set foot on the lunar surface on July 20, 1969, a full five months before the challenge deadline.
Achieving that success took a tremendous investment and focus of money and national resources, of course--an investment that was available because, as Kennedy made clear in his speech, going to the moon was not just an interesting scientific endeavor.
"If we are to win the battle that is now going on around the world between freedom and tyranny, the dramatic achievements in space which occurred in recent weeks [on May 5, 1961, Alan Shepard had become the first American in space] should have made clear to us all, as did the Sputnik in 1957, the impact of this adventure on the minds of men everywhere, who are attempting to make a determination of which road they should take," Kennedy said, stressing that taking a "clearly leading role" in space might even "hold the key to our future on earth."
Why the moon? Because, Kennedy said, "no single space project in this period will be more impressive to mankind."
Kennedy was undoubtedly correct in that assessment. Furthering knowledge and understanding about the universe by increments is not nearly as inspiring a goal or as strong a competitive political masterstroke as "land a man on the moon, in this decade, and return him safely to earth." A moon mission has imagination, a clear victory point--and, as retired astronaut Story Musgrave likes to point out, all the elements of great project management: a clear focus, clear requirements, a clear goal, and a clear timeline in which to accomplish that goal.
The eight-year Apollo effort leading to the moon landing also sparked the development of all kinds of new technology: rockets, life-support systems, lightweight materials, protective coatings--even really cool pens that wrote in zero gravity. It also undoubtedly inspired many schoolchildren in the 1960s to pursue engineering, in the hopes of becoming part of the grand space adventure when they grew up.
But while the moon landing was unquestionably inspirational--I still remember racing home from a camping trip to watch it on TV--and a decisive public-relations victory for the U.S. in its "space war" with the Soviet Union, it came at a price. In the late 1950s, NASA was working on other, more sophisticated ways of getting into space. The X-15 rocket plane (pictured below) incorporated exotic materials and the first throttle-controlled rocket engine, and was designed to fly at more than six times the speed of sound, at altitudes above 250,000 feet. At those altitudes, it used small bursts from hydrogen-peroxide thrusters for control (normal aircraft control surfaces, which depend on air pressure, would be useless outside the atmosphere) and then glided back for an unpowered landing on earth.
And yes, that's Neil Armstrong in that photo--Armstrong served as an X-15 test pilot before joining the astronaut corps.
The military was also working on a space plane project called Dyna-Soar, while other researchers at NASA worked on concepts for lifting bodies--highly efficient, if odd-shaped, spacecraft that could handle the heat of re-entry while still being controllable within the atmosphere (see the examples below).
There was, in fact, a division within NASA between the "airplane" folks, who wanted to develop more sophisticated, reusable spacecraft that could fly into space and back, and the "rocket" folks, who advocated the brute force of a rocket launcher with a capsule on top as the best (and fastest) way to get space capability. But with the tight deadline imposed by Kennedy's challenge of getting a man to the moon and back within nine years, it became clear that the more complex reusable engines and spacecraft would take too long to develop. The rockets won the day, and the funding and focus turned away from hypersonic space vehicles and space "flight."
The Space Shuttle did, in fact, incorporate some of the earlier design concepts from the airplane side of NASA--including its reliance on gliding back to an unpowered landing on earth. But concepts like a single-stage-to-orbit rocket engine, scramjets and ramjets for ultra-high-speed transport planes, and better reusable spacecraft designs never made it off the drawing board. If they had, we might now have commercial spaceflight vehicles hopping from Japan to Chicago on a regular basis. As it is, even the Shuttle has to rely on the brute force of disposable rocket engines to get out of the earth's atmosphere, at a cost of around half a billion dollars a pop.
The moon program also seemed to lock our collective imagination into a fixed formula for human spaceflight--spaceflight as an engineering project--even when those missions had questionable scientific value (with notable exceptions like the launch and repair of the Hubble Space Telescope). After all, even the moon mission was primarily an engineering challenge, not a scientific research mission.
As Story Musgrave put it in the interview noted above,
We could have had multiple Voyagers landed or floating in the atmosphere on every planet and on every moon of every planet. That is what we gave up when we went with [the International Space Station]. If you sent multi-media robotic machines [into space], people would be unbelievably excited about going everywhere out there. And we could have gone everywhere. But we opted to stay in low-earth orbit and do a jobs program because we had no imagination.
Musgrave is not the only one of that opinion. John M. Logsdon, a space policy specialist who's written a new book on the subject (John F. Kennedy and the Race to the Moon), told a New York Times writer last week that despite having praised the Apollo program in an earlier book, he's since come to the conclusion that the Apollo program's impact on the space program has "on balance, been negative." Apollo, Logsdon said, was "a dead-end undertaking in terms of human travel beyond the immediate vicinity of this planet."
Certainly the human space flight program, and the International Space Station, have more than a few critics. And the focus on the human spaceflight side of NASA has deflected huge amounts of money and brainpower away from other research efforts. The question is ... could the situation have been different?
I'm a huge fan of the more sophisticated design ideas that languished at NASA in the post-Kennedy-challenge era, as well as many of the other technologies that could have been developed with that money. Not to mention the scientific discoveries we could have made if we'd put the effort there instead of sending crew after crew into the same orbit around the earth. The materials and mind-bending physics know-how required to build a spacecraft capable of truly distant space flight outside our galaxy still lie beyond our reach. But we might be closer if we'd put a big chunk of the human space flight budget toward that effort.
On the other hand, the prodigious Apollo funding would likely not have been approved for anything less clear, less politically impactful or less mesmerizing than putting a human on the moon. So in many ways, whether or not the Apollo money could have been better spent is a moot point. And there is something to be said--something pretty compelling--for having gotten a human off the planet, onto another celestial body, and back home again.
The issue with Apollo is just that it set expectations so strongly in one direction, and left NASA so geared up to pursue human spaceflight, that it was difficult to shift gears after the moon landing was accomplished. Important scientific and aerospace technology research has continued at numerous NASA Centers around the country (think Mars Rover, satellite and GPS technology, and a host of telescopes, safety technology, and aircraft design and efficiency improvements). But the human space flight side of NASA continued to get a big chunk of the budget pie, even after the Apollo program concluded and there wasn't another clear goal for humans to accomplish in space.
But if our focus never shifted to the amazing scientific discoveries that might have been found, it's at least in large part because what drove the Apollo program--as President Kennedy made abundantly clear in that speech 50 years ago--wasn't science. It was a strategic blow against the Soviet Union, and for the achievements of democracy, in a world where communism was seen as a real and growing threat. Period. Paragraph. End of discussion.
Still--one of the intriguing parts of Kennedy's speech (and there are many) is how strongly he stressed to Congress and the American people that if they were not willing to sacrifice for this goal, and commit fully to its achievement, no matter what it took, then it would be better not to attempt it at all.
"If we are to go only half way, or reduce our sights in the face of difficulty, in my judgment it would be better not to go at all," Kennedy said. "There is no sense in agreeing or desiring that the United States take an affirmative position in outer space, unless we are prepared to do the work and bear the burdens to make it successful."
Of course, it was easier to say that in 1961, before NASA had as many Centers and workforces whose jobs would be endangered if the nation decided that, in fact, it would rather not bear all those burdens and pay all those costs.
But 50 years later, Kennedy's point is still valid. Some of the work in low-earth orbit that NASA used to do is being handed off to private industry. The great promise of NASA's current space program now lies in technology advancement and exploratory science. Those developments might someday yield another clear goal worth pursuing in person: an exotic, distant place brought almost within reach, worth a mighty, focused effort for humans to go explore.
But the true challenge Kennedy threw down in that 1961 speech still applies. Without a Soviet rival to "race," and without the imperative of a cold war threat to counter, do we really care enough about space for science and exploration's sake to pay the costs and bear the burdens for that effort to bear dramatic fruit? The jury is still out on that one, in part because I don't know that the country's been asked to sacrifice much for NASA's scientific efforts. But in any event, as Kennedy said, we shouldn't attempt something halfway. We should figure out what scientific, engineering, or technology goals we really do care enough about to pursue, get excited about, and focus on carrying those through to completion--and let the rest go.
The part of that 1961 speech that Kennedy is remembered for is the moon challenge. But his challenge to Congress and the nation to think about whether or not space was worth the effort, and to walk away unless "every scientist, every engineer, every serviceman, every technician, contractor, and civil servant gives his pledge that this nation will move forward ... [without] undue work stoppages, inflated costs of material or talent, wasteful interagency rivalries, or a high turnover of key personnel" is the part of the speech that has the most lasting relevance.
What goal, if any, do we care enough about to commit to that fully? Fifty years later, the question still lingers in the air, awaiting an answer.
Einstein’s gravitational waves rest on a genuinely radical idea.
After decades of anticipation, we have directly detected gravitational waves—ripples in spacetime traveling at the speed of light through the universe. Scientists at LIGO (the Laser Interferometer Gravitational-Wave Observatory) have announced that they have measured waves coming from the inspiral of two massive black holes, providing a spectacular confirmation of Albert Einstein’s general theory of relativity, whose hundredth anniversary was celebrated just last year.
Finding gravitational waves indicates that Einstein was (once again) right, and opens a new window onto energetic events occurring around the universe. But there’s a deeper lesson, as well: a reminder of the central importance of locality, an idea that underlies much of modern physics.
The bureau successfully played the long game in both cases.
The story of law enforcement in the Oregon standoff is one of patience.
On the most obvious level, that was reflected in the 41 days that armed militia members occupied the Malheur National Wildlife Refuge near Burns. It took 25 days before the FBI and state police moved to arrest several leaders of the occupation and to barricade the refuge. It took another 15 days before the last of the final occupiers walked out, Thursday morning Oregon time.
Each of those cases involved patience as well: Officers massed on Highway 395 didn’t shoot LaVoy Finicum when he tried to ram past a barricade, nearly striking an FBI agent; they fired only when he reached for a gun in his pocket. Meanwhile, despite increasingly hysterical behavior from David Fry, the final occupier, officers waited him out until he emerged peacefully.
Most people know how to help someone with a cut or a scrape. But what about a panic attack?
Here’s a thought experiment: You’re walking down the street with a friend when your companion falls and gashes her leg on the concrete. It’s bleeding; she’s in pain. It’s clear she’s going to need stitches. What do you do?
This one isn’t exactly a head-scratcher. You'd probably offer some sort of first aid until the bleeding stopped, or until she could get to medical help. Maybe you happen to have a Band-Aid on you, or a tissue to help her clean the wound, or a water bottle she can use to rinse it off. Maybe you pick her up and help her hobble toward transportation, or take her where she needs to go.
Here’s a harder one: What if, instead of an injured leg, that same friend has a panic attack?
Ben Stiller’s follow-up to his own comedy classic is a downright bummer, no matter how many celebrity cameos it tries to cram in.
You don’t need to go to the theater to get the full experience of Zoolander 2. Simply get your hands on a copy of the original, watch it, and then yell a bunch of unfunny topical lines every time somebody tells a joke. That’s how it feels to watch Ben Stiller’s sequel to his 2001 spoof of the fashion industry: Zoolander 2 takes pains to reference every successful gag you remember from the original, and then embellish them in painful—often offensive, almost always outdated—fashion. It’s a film that has no real reason to exist, and it spends its entire running time reaffirming that fact.
The original Zoolander, to be fair, had no business being as funny as it was—it made fun of an industry that already seems to exist in a constant state of self-parody, and much of its humor relied on simple malapropisms and sight gags. But it was hilarious anyway as a candid snapshot of the fizzling-out of ’90s culture. Like almost any zeitgeist comedy, it belonged to a particular moment—and boy, should it have stayed there. With Zoolander 2, Stiller (who directed, co-wrote, and stars) tries to recapture the magic of 2001 by referencing its past glories with increasing desperation, perhaps to avoid the fact that he has nothing new to say about the fashion industry or celebrity culture 15 years later.
Today’s empires are born on the web, and exert tremendous power in the material world.
Mark Zuckerberg hasn’t had the best week.
First, Facebook’s Free Basics platform was effectively banned in India. Then, a high-profile member of Facebook’s board of directors, the venture capitalist Marc Andreessen, sounded off about the decision to his nearly half a million Twitter followers with a stunning comment.
“Anti-colonialism has been economically catastrophic for the Indian people for decades,” Andreessen wrote. “Why stop now?”
After that, the Internet went nuts.
Andreessen deleted his tweet, apologized, and underscored that he is “100 percent opposed to colonialism” and “100 percent in favor of independence and freedom.” Zuckerberg, Facebook’s CEO, followed up with his own Facebook post to say Andreessen’s comment was “deeply upsetting” to him, and not representative of the way he thinks “at all.”
The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regimen for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
By mining electronic medical records, scientists show the lasting legacy of prehistoric sex on modern humans’ health.
Modern humans originated in Africa, and started spreading around the world about 60,000 years ago. As they entered Asia and Europe, they encountered other groups of ancient humans that had already settled in these regions, such as Neanderthals. And sometimes, when these groups met, they had sex.
We know about these prehistoric liaisons because they left permanent marks on our genome. Even though Neanderthals are now extinct, every living person outside of Africa can trace between 1 and 5 percent of their DNA back to them. (I am 2.6 percent Neanderthal, if you were wondering, which pales in comparison to my colleague James Fallows at 5 percent.)
This lasting legacy was revealed in 2010 when the complete Neanderthal genome was published. Since then, researchers have been trying to figure out what, if anything, the Neanderthal sequences are doing in our own genome. Are they just passive hitchhikers, or did they bestow important adaptations on early humans? And are they affecting the health of modern ones?
Jim Gilmore joins Chris Christie and Carly Fiorina, and leaves the race after a poor showing in New Hampshire.
Jim Gilmore’s candidacy this year was improbable—but even more improbable was the minor cult of personality that developed around it.
The former Virginia governor never had a chance. Not, like, in the sense of Lindsey Graham, a candidate with national standing but no path to the presidency. More in the George Pataki sense: a guy who had no real business in the race, but was running anyway. Except that Gilmore made Pataki look like a juggernaut. Also, Pataki saw the writing on the wall and had the sense to drop out in late December. Gilmore soldiered on, and ended up as the last of the true long shots to leave.
The result was that Gilmore turned into a sort of folk hero. Not for voters, mind you—he managed only 12 votes in Iowa and 125 in New Hampshire, and his campaign was funded largely by loans from himself. Because of his low support in the polls, Gilmore made the cut only for the very first kids’-table debate in August, and then again for the undercard in late January. Other than that, he was shut out completely.
A robotic road safety worker in India, a sacrificial llama in Bolivia, a sea otter receives a valentine, a deadly earthquake in Taiwan, a leopard attack in India, and much more.
A murmuration of starlings over Israel, a robotic road safety worker in India, a sacrificial llama in Bolivia, border barriers between Tunisia and Libya, a sea otter receives a valentine, a deadly earthquake in Taiwan, the annual Shrovetide football match in England, a leopard attack in India, and much more.
The country’s growth is slowing. The wrong response might make the problem worse.
An anxious superpower is confounded by a troubled economy. For a generation, its growth has been envied; now that growth is decelerating sharply. For decades, it has shaped and guided its economy via tight control of its banks; now that lever is malfunctioning. For years, it has carefully managed its exchange rate and limited the flow of capital across its borders; now the dam is cracking. To anyone who keeps up with the news, the superpower would seem easy to identify: China. But for those with a long memory, it could just as well be the United States of the Nixon era.
Like China today, the United States of the 1970s experienced an abrupt economic slowdown. Its economy had expanded by 4.4 percent a year, on average, during the go-go ’50s and ’60s, but growth slowed by about one-quarter during the following decade, to 3.2 percent a year. Even though growth of more than 3 percent may sound robust by today’s standards, at the time it felt ghastly. Time magazine lamented in 1974 that “middle-class people are being pushed into such demeaning economies as buying clothes at rummage sales”; a year or so later, its cover asked, “Can Capitalism Survive?” In September 1975, after President Gerald Ford survived two attempts on his life in quick succession, an adviser named Alan Greenspan responded with a memo about the “nihilism, radicalism, and violence” that seemed to grip some Americans. When New York City flirted with bankruptcy, its plight was taken as a symbol of broader moral and cultural decay.