As much as we play up the importance of scientific research, President Obama's NASA budget shows that it's the risky human side of the space program that draws the nation's attention and funding.
This week marks the 50th anniversary of John Glenn's Friendship 7 space flight--the third in NASA's Mercury space program and the first of those flights to successfully orbit the Earth. Coming as it does, only a week after President Obama released his 2013 budget priorities for NASA, the milestone anniversary, with all its triumphant photos and memories, provides a reminder of why the new NASA budget is skewed the way it is. It also says something, for better or for worse, about what most of us prefer, when it comes to great undertakings.
Since its inception in 1958, the space side of NASA has had a dual personality, in more ways than one. The biggest duality has been the obvious split between "manned" and "unmanned" missions, which paralleled to a large degree a second split between science and engineering.
Even scientific satellites require engineering know-how to actually reach space or perform experiments there. But the "manned" efforts (or "human spaceflight" missions, as they are now generally called) have always been primarily engineering challenges. My uncle's former father-in-law worked for the rocket manufacturer Rocketdyne during NASA's glory days of Mercury, Gemini and Apollo. And one of his favorite phrases, in fact, was, "there is no such thing as a rocket scientist."
Aside from the obvious human element, the difference between scientific and "manned" missions is the end result. Successful scientific missions bring back, or enable, discoveries: greater knowledge about science, the universe, and the planet we call home. In contrast, the success of human spaceflight missions has been counted primarily in human achievements: the first man off the planet, to orbit the Earth, to orbit the moon, or to land on the moon and return safely to Earth. We proved we could build and successfully operate (with a couple of glaring exceptions) reusable spacecraft that landed on a runway. We set endurance records for humans living in space. We proved we could build something in space.
Scientific satellites are also engineering achievements, of course. But we don't sell planetary probes as a way of proving our human greatness. We sell them as a way to discover more about Mars, or Jupiter's moons, and about whether life ever existed there. The emphasis of the scientific missions, in other words, is on the intrinsic value of knowledge they produce, which is to say, on something other than us.
And therein lies the crux of the problem with scientific missions. Or, at least, the problem when it comes to getting public funding and support.
President Obama's proposed 2013 budget trims NASA's overall budget, but only by a small amount. The noticeable shift is that it reduces funding for scientific planetary missions by 20 percent, while almost doubling the budget for continued work on future human spaceflight missions. Almost $3 billion is being allocated to further development of a heavy-lift booster rocket and the Orion Multi-Purpose Crew Vehicle. Another $3 billion is slated for continued support of the space station, even though that project has received enormous criticism for how little return on investment it has produced, overall. Story Musgrave, one of NASA's most experienced veteran astronauts, even called it little more than a "jobs program" and a "$100 billion mistake."
Planetary science missions, done remotely with spacecraft and robots, are far less costly. Yet, at the same time as the budget for human spaceflight is increasing, the 2013 budget calls for a reduction in planetary science mission funding from $1.5 billion to $1.2 billion. Why?
One could argue, of course, that discovering water, or traces of microscopic life, on Jupiter's moon Europa will not transform our understanding of life or the universe. And that might very well be true. But if the standard for funding were that missions offer transformative knowledge of life or the universe, flying astronauts back to the Moon or to Mars (as opposed to highly capable robots) wouldn't pass the bar, either. What those human missions do provide are athlete-heroes to cheer.
Looking at the news photos of John Glenn, riding in a ticker-tape parade with President Kennedy after his successful orbital flight, it's easy to see why human spaceflight gets so much more funding and support. "In the winter of 1962," the opening line in a New York Times article about the anniversary began, "the nation needed a hero."
For as much as we try to play up the science fair whiz kids who create robots and technology, we're still very attached to the explorer/athlete/star champion model of hero. Designing a robot to explore Mars is a kind of "team personality" achievement: an effort by a team player and builder who works in concert with others to put something or someone else forward (in this case, a robot or satellite) to get the glory. And we still get much more satisfaction from cheering on the star who actually does the glorious deed. Especially if the deed involves physical feats or physical risks to self. We idolize the quarterback, not the lineman who makes it possible for the quarterback to make that play. The race driver, not the crew. The player who scores the basket, not the guard who makes the assist. The brave astronaut who repairs the Hubble Space Telescope in space, rather than the guy who designed the fix in the first place.
In the case of robotic or satellite missions in space, the human achievement is primarily mental, and takes place on the ground, in a lab, with lots of career and project risk, but little physical danger. And the big end prize that comes out of the process is the esoteric reward of knowledge. That doesn't quite match the thrill of our hero winning an Olympic Gold Medal or our team winning the Super Bowl or the World Series.
In the 1980s, the television show Cheers, which revolved around a neighborhood bar in Boston, opened with a series of vintage photos from real local watering holes. The image I remember best shows a beaming bartender holding up a newspaper with a 4-inch banner headline across the top proclaiming, "WE WIN!!!!!" Imagine a similar headline proclaiming, instead, "WE LEARN!!!!!!" Right. You can't. And that's the point.
Discovery is about expanding our understanding of something else. Achievement is a much more satisfying ego stroke about ourselves. Our heroes are the stand-ins for ourselves; for what we get to see ourselves as capable of doing. And physical achievements--for whatever reasons we still prize the physical so highly--get us more excited than academic ones. Perhaps physical achievements are easier to get our hands and minds around. Or perhaps it's the competitive element that many of those physical achievements contain. We beat the Russians, or we bested Nature, or we bested ... well, something. Whatever the reason, the truth remains: we may give academic achievers prizes for enabling discoveries, but we don't give them 4-inch banner headlines or ticker-tape parades.
Keeping a human alive in space is far more costly and complex than sending a robot on the same mission. There is, to be sure, an argument that in the process of designing the life systems to sustain a human crew all the way to Mars and back, for example, we will further technology to a point where we can then figure out how to make a more distant step possible. On the other hand, there's a pretty strong argument to be made for pushing the boundaries first robotically--both to develop the physics, propulsion and materials technology to make deep space travel possible at a much more reasonable cost, and also to explore what parts or objects in space might be worth following up on with a human mission.
There are other factors in the decision, of course. The human spaceflight side of NASA creates a lot of jobs, in a lot of states. So shelving it for the foreseeable future would have serious political and economic ramifications, which no politician wants to face. But it would also require us to readjust our notions of what's worth a 4-inch headline. And I'm not sure we're there, yet.
Could we change that? Maybe. But it's not simply a rational issue of the best investment of funds for NASA. It goes much deeper than that. The fact that we get more excited about competitive endeavors that have a human at the center of them, and entail real, physical risks and consequences, might make us slightly egotistic, or self-centered, or even primitive in some way. But it is also an inclination that is, for better or worse, very human--and goes back in history a very long time.
The Republican Party laid the groundwork for dysfunction long before Donald Trump was elected president.
President Trump’s approach to governance is unlike that of his recent predecessors, but it is also not without antecedents. The groundwork for some of this dysfunction was laid in the decades before Trump’s emergence as a political figure. Nowhere is that more true than in the disappearance of the norms of American politics.
Norms are defined as “a standard or pattern, especially of social behavior, that is typical or expected of a group.” They are how a person is supposed to behave in a given social setting. We don’t fully appreciate the power of norms until they are violated on a regular basis. And the breaching of norms often produces a cascading effect: As one person breaks with tradition and expectation, behavior previously considered inappropriate is normalized and taken up by others. Donald Trump is the Normless President, and his ascendancy threatens to inspire a new wave of norm-breaking.
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump's predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump's forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America's founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
The right’s old guard faces an existential threat in populism. But it isn’t yet clear that they understand the stakes or possess the confidence to fight back.
Donald Trump’s rise to power put National Review, The Weekly Standard, and the sorts of journalists who work there in a distressing bind. Neither the president nor the #MAGA loyalists who staff his White House adhere to conservative principles. Yet many donors, subscribers, and readers who sustain their publications prefer Trump’s blustering, bombastic project, massively shifting the center of gravity on the right.
Tribalist populism is ascendant––and conservative publications no longer benefit as a result, in part because newer magazines and websites are more closely aligned with it.
During the 1950s, when the postwar governing establishment presumed a liberal consensus and the right was as internally divided as it is now, William F. Buckley built a competing coalition in part by winning converts on the right to conservatism, famously declaring himself to be standing athwart history yelling, “Stop!”
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
What was it like inside the brain of an ancient prophet?
James Kugel has spent his entire scholarly career studying the Bible, but some very basic questions about it still obsess him. What was it about the minds of ancient Israelites that allowed them to hear and see God directly—or at least, to believe that they did? Were the biblical prophets literally hearing voices and seeing visions, understanding themselves to be transmitting God's own exact words? If so, why did such direct encounters with God become rarer over time?
In his new and final book, The Great Shift, Kugel investigates these questions through the lens of neuroscientific findings. (The approach is reminiscent of other recent books, like Kabbalah: A Neurocognitive Approach to Mystical Experiences, co-written by a neurologist and a mysticism scholar.) First, Kugel uses biblical research to show that ancient people had a “sense of self” that was fundamentally different from the one modern Westerners have—and that this enabled them to experience and interpret prophecy differently than we do. Then he uses scientific research to show that we shouldn’t assume their view was wrong. If anything, our modern Western notion of the bounded, individual self is the anomaly; most human beings throughout history conceived of the self as a porous entity open to intrusions. In fact, much of the rest of the world today still does.
The first modern market crash, in 1987, reflected lasting changes in how Wall Street works. Regulators still haven’t adjusted.
Veterans of the stock market insist that the four most dangerous words on Wall Street are “this time is different.”
It rarely is. In the autumn of 1929, Irving Fisher, a prominent economist at Yale, assured Americans that stock prices had reached “what looks like a permanently high plateau.” That was on October 15, just days before the opening stumble in an epic crash that, by its nadir in 1931, would cut the American stock market’s value by almost 90 percent. In early 2000, exuberant tech analysts argued that revenues and profits were no longer the metrics that mattered in the internet age. Then the dot-com bubble burst and the technology-laden Nasdaq Composite Index fell almost 80 percent between March 2000 and October 2002. Clearly, when the market is soaring, it can be ruinous to believe that its highs are the mark of some fundamental financial shift.
Access to insurance isn’t erasing race- and class-based health disparities.
It’s often said that Americans are living “longer, healthier lives,” and while that’s true overall, white wealthy people are still far more likely to enjoy good health than other demographics in old age. A new research letter published in JAMA Internal Medicine on Monday revealed clear racial, income, and educational disparities in the number of senior citizens who experience good health—and the gaps have only widened over time.
For this report, researchers from the University of Michigan looked at adults older than 65 who reported their health as “excellent” or “very good” at least twice within one calendar year, between 2000 and 2014. As a whole, seniors were feeling healthier by the end of that time period. However, white people and wealthy people were most likely to consider themselves very healthy. The most highly educated seniors consistently felt healthier than those with less education, and as the study went on, the health of that group only improved. In 2000, 57.4 percent of highly educated seniors considered themselves in very good or excellent health, and by 2014, that number had risen to 63 percent.
CNN and The New York Times on Monday shed new light on the federal investigation into the former Trump campaign chairman, but some aspects of the probe remain mysterious.
Special Counsel Robert Mueller’s investigation into Paul Manafort, the onetime Trump campaign chairman, appears to be moving far more aggressively than any other part of the sprawling Russia investigation.
On Monday night, The New York Times reported new details about the July raid on Manafort’s Virginia home. Federal investigators broke into his house in the early morning hours, seizing documents and computer files about his extensive business dealings. The Times also reported that Mueller’s team told Manafort that he would soon be indicted, the first indication to date that the former FBI director will ultimately file criminal charges in his sweeping probe.
Soon thereafter, CNN reported that Manafort had been the target of a special warrant issued under the Foreign Intelligence Surveillance Act, or FISA, before and after the 2016 presidential election. The exact dates of his surveillance are unclear: CNN reported it began sometime after the FBI opened an investigation into his political consulting in 2014, ended in May of 2016, and then resumed sometime after the November election.
A good marriage is no guarantee against infidelity.
“Most descriptions of troubled marriages don’t seem to fit my situation,” Priya insists. “Colin and I have a wonderful relationship. Great kids, no financial stresses, careers we love, great friends. He is a phenom at work, fucking handsome, attentive lover, fit, and generous to everyone, including my parents. My life is good.” Yet Priya is having an affair. “Not someone I would ever date—ever, ever, ever. He drives a truck and has tattoos. It’s so clichéd, it pains me to say it out loud. It could ruin everything I’ve built.”
Priya is right. Few events in the life of a couple, except illness and death, carry such devastating force. For years, I have worked as a therapist with hundreds of couples who have been shattered by infidelity. And my conversations about affairs have not been confined within the cloistered walls of my therapy practice; they’ve happened on airplanes, at dinner parties, at conferences, at the nail salon, with colleagues, with the cable guy, and of course, on social media. From Pittsburgh to Buenos Aires, Delhi to Paris, I have been conducting an open-ended survey about infidelity.