The Soviets might have landed on two planets to America's one, but the extent of the ultimate U.S. space victory is a sort of metaphor for the Cold War and its resolution.
The Venera 7, left, landed on Venus in 1970. The Venera 13 took this image of the planet in 1982. (Wikimedia)
In the end, when the nuclear warheads were taken off alert and the borders of Europe and Asia redrawn, history recorded the Cold War as a great American victory. It won the arms race and it won Europe; its economic and political models both triumphed; and it won the war of ideology, with democracy displacing communism and totalitarianism across most of the globe. But there's one arena where the Cold War looked a bit closer to a tie: space.
The Soviet Union was the first to put a satellite in space, the first to put a person in space, the first to land a spacecraft on the moon, and the first -- and only -- to land on Venus. The U.S. was the first to put a person on the moon, the first to do flybys of Mars, Venus, and Jupiter, and the first -- but not only -- to land on Mars, most recently with today's Curiosity. (The European Space Agency later got into the game by landing a probe on Titan, a moon of Saturn, in 2005 with assistance from a U.S. spacecraft.) I don't know whether or how you can declare a winner from those two records, but one thing is clear: 20 years after the collapse of the Soviet Union and all it stood for, the U.S. has not met the Soviet record on number of planet surfaces visited.
Of course, space exploration isn't about beating the Soviets anymore, so the U.S. would have little to gain by visiting another planet just to say we did. And, when it comes to actual scientific knowledge gained and height of technological achievement, the Soviet edge is as broken and gone as the Berlin Wall. Still, this old, unchanged record is a reminder of the Soviet Union's deep mark on history, and that it wasn't so long ago that space, an area of global American leadership today, was closely contested, another front in the all-consuming Cold War.
The first manmade object ever to soft-land on another planet was the Soviet-made Venera 7. It launched from an Earth-orbit platform on August 17, 1970, just over a year after Neil Armstrong walked on the moon, and entered the Venusian atmosphere on December 15. The Soviet command received 23 minutes of faint signals, the first data beamed from the surface of another world. In 1975, the Soviet Union landed the more successful Venera 9 and Venera 10, which sent back the first photos. Later Venera landers analyzed soil on the surface and sent back color, panoramic views in 1982, and the follow-on Vega landers touched down again in 1985. The U.S. never attempted to land on Venus, but it has sent orbiters, including 1978's Pioneer Venus 1; a companion craft, the Pioneer Venus Multiprobe, dropped one large and three small probes into the atmosphere that same year.
The Soviet Union might have won the race to Venus, but Mars was more contested. In May 1971, as a proxy war in Vietnam raged, the U.S. and Soviet Union hurled five satellites toward the red planet. Mariner 8 and Kosmos 419 fizzled, but on November 13 the American Mariner 9 became the first vessel to enter another planet's orbit. Two weeks later, the Soviet Union's Mars 2 followed into orbit, with the Mars 3 a few days behind. The U.S. satellite took over 100 times as many photos as the two Soviet ships, but Mars 2 and Mars 3 both carried landers. The first crashed; the second achieved the first-ever landing on Mars. But it lasted only 20 seconds, after which its instruments shut down, possibly due to a dust storm.
Both the U.S. and Soviet Union tried a number of Mars landers after that, but the Americans had far more success. In 1974, the Soviet Union suffered another disappointment with the Mars 6, whose lander returned corrupted data during its descent because of a degraded computer chip and fell silent at touchdown, and the Mars 7, whose lander simply missed the planet. The U.S. landed the two Viking craft in 1976, and later upgraded to rovers with the 1997 Sojourner, 2004 Spirit and Opportunity, and 2012 Curiosity. A Soviet vessel never again successfully touched down, despite two 1988 attempts.
In a way, the planetary race can be seen as a metaphor for the Cold War itself. The competition might have been nail-bitingly close at the time, with the Soviet Union taking some historic leaps ahead of the Americans, a few of which are still with us. In the end, though, not only did the U.S. win, but the extent of its victory has surely surpassed even the wildest dreams of either Nixon or Khrushchev.