The Soviets might have landed on two planets to America's one, but the extent of the ultimate U.S. space victory is a sort of metaphor for the Cold War and its resolution.
[Image: Venera 7, left, landed on Venus in 1970. Venera 13 took this image of the planet in 1982. (Wikimedia)]
In the end, when the nuclear warheads were taken off alert and the borders of Europe and Asia redrawn, history recorded the Cold War as a great American victory. It won the arms race and it won Europe; its economic and political models both triumphed; and it won the war of ideology, with democracy displacing communism and totalitarianism across most of the globe. But there's one arena where the Cold War looked a bit closer to a tie: space.
The Soviet Union was the first to put a satellite in space, the first to put a person in space, the first to land a spacecraft on the moon, and the first -- and only -- to land on Venus. The U.S. was the first to put a person on the moon, the first to do flybys of Mars, Venus, and Jupiter, and the first -- but not only -- to land on Mars, most recently with today's Curiosity. (The European Space Agency later got into the game by landing a probe on Titan, a moon orbiting Saturn, in 2005 with assistance from a U.S. spacecraft.) I don't know whether or how you can declare a winner from those two records, but one thing is clear: 20 years after the collapse of the Soviet Union and all it stood for, the U.S. has not met the Soviet record for the number of planet surfaces visited.
Of course, space exploration isn't about beating the Soviets anymore, so the U.S. would have little to gain by visiting another planet just to say we did. And, when it comes to actual scientific knowledge gained and height of technological achievement, the Soviet edge is as broken and gone as the Berlin Wall. Still, this old, unchanged record is a reminder of the Soviet Union's deep mark on history, and that it wasn't so long ago that space, an area of global American leadership today, was closely contested, another front in the all-consuming Cold War.
The first manmade object ever to soft-land on another planet was the Soviet-made Venera 7. It launched on August 17, 1970, just over a year after Neil Armstrong walked on the moon, was boosted toward Venus from Earth parking orbit, and entered the Venusian atmosphere on December 15. The Soviet command received 23 minutes of faint signals, the first data beamed from the surface of another world. In 1975, the Soviets landed the more successful Venera 9 and Venera 10, which sent back the first photos. Later Venera missions analyzed soil samples and returned color, panoramic views in 1981 and again in 1985. The U.S. never attempted to land on Venus, but it has sent orbiters, including 1978's Pioneer Venus 1; a companion mission, Pioneer Venus 2, dropped probes into the planet's atmosphere.
The Soviet Union might have won the race to Venus, but Mars was more contested. In May 1971, as a proxy war in Vietnam raged, the U.S. and Soviet Union hurled five spacecraft toward the red planet. Mariner 8 and Kosmos 419 fizzled, but on November 13 the American Mariner 9 became the first vessel to orbit another planet. Two weeks later, the Soviet Union's Mars 2 followed into orbit, with Mars 3 a few days behind. The American orbiter took over 100 times as many photos as the two Soviet ships, but Mars 2 and Mars 3 both carried landers. The first crashed; the second achieved the first-ever landing on Mars. But its transmission lasted only 20 seconds, after which its instruments shut down, possibly due to a dust storm.
Both the U.S. and Soviet Union attempted a number of Mars landers after that, but the Americans had far more success. In 1974, the Soviet Union had another disappointment with Mars 6, which made it to the surface but sent back mostly unreadable data due to a computer chip problem, and Mars 7, which simply missed the planet. The U.S. landed the two Viking landers in 1976, and later upgraded to rovers with the 1997 Sojourner, 2004 Spirit and Opportunity, and 2012 Curiosity. A Soviet vessel never again successfully touched down, despite two 1988 attempts.
In a way, the planetary race can be seen as a metaphor for the Cold War itself. The competition might have been nail-bitingly close at the time, with the Soviet Union taking some historic leaps ahead of the Americans, a few of which are still with us. In the end, though, not only did the U.S. win, but the extent of its victory has surely surpassed even the wildest dreams of either Nixon or Khrushchev.
The tension between religious liberty and same-sex marriage may eventually come to a head in the courts, but probably not through the Kentucky clerk’s case.
As Rowan County clerk Kim Davis crawls further and further out on a limb, Supreme Court experts agree that she has little chance of prevailing. On August 12, District Judge David Bunning ordered Davis, in her capacity as county clerk, to issue marriage licenses to all couples who meet the statutory criteria for marriage in Kentucky—a definition that, since the Court’s landmark decision in Obergefell v. Hodges, includes same-sex couples.
Davis has refused, citing “the authority of God.” The U.S. Supreme Court, without comment, denied her emergency request for a stay. This throws the case back to the Sixth Circuit, which will hear the appeal of Judge Bunning’s order. Assuming she loses in the Sixth Circuit—a fairly good assumption—she would then have the option of petitioning the Supreme Court to hear her religious freedom claim. The Court will eventually hear a case about religious freedom and same-sex marriage, but I don’t think it will be this one.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
The past is beautiful until you’re reminded it’s ugly.
Taylor Swift’s music video for “Wildest Dreams” isn’t about the world as it exists; it’s about the world as seen through the filter of nostalgia and the magic of entertainment. In the song, Swift sings that she wants to live on in an ex’s memory as an idealized image of glamour—“standing in a nice dress, staring at the sunset.” In the video, her character, an actress, falls in love with her already-coupled costar, for whom she’ll live on as an idealized image of glamour—standing in a nice dress, staring at a giant fan that’s making the fabric swirl in the wind.
The setting for the most part is Africa, but, again, the video isn’t about Africa as it exists, but as it’s seen through the filter of nostalgia and the magic of entertainment—a very particular nostalgia and kind of entertainment. Though set in 1950, the video is in the literary and cinematic tradition of white savannah romances, the most important recent incarnation of which might be the 1985 Meryl Streep film Out of Africa, whose story begins in 1913. Its familiarity is part of its appeal, and also part of why it’s now drawing flak for being insensitive. As James Kassaga Arinaitwe and Viviane Rutabingwa write at NPR:
A Brooklyn-based group is arguing that the displacement of longtime residents meets a definition conceived by the United Nations in the aftermath of World War II.
No one will be surprised to learn that the campaign to build a national movement against gentrification is being waged out of an office in Brooklyn, New York.
For years, the borough’s name has been virtually synonymous with gentrification, and on no street in Brooklyn are its effects more evident than on Atlantic Avenue, where, earlier this summer, a local bodega, protesting its impending departure in the face of a rent hike, put up sarcastic window signs advertising “Bushwick baked vegan cat food” and “artisanal roach bombs.”
Just down the block from that bodega are the headquarters of Right to the City, a national alliance of community-based organizations that since 2007 has made it its mission to fight “gentrification and the displacement of low-income people of color.” For too long, organizers with the alliance say, people who otherwise profess concern for the poor have tended to view gentrification as a mere annoyance, as though its harmful effects extended no further than the hassles of putting up with pretentious baristas and overpriced lattes. Changing this perception is the first order of business for Right to the City: Gentrification, as these organizers see it, is a human-rights violation.
But letting customers buy their own would force cable companies to improve their equipment.
One of the least glamorous realities of the American cable industry is a relic dating to 1948: the cable box. The box has become a fixture in the American household, not least because it is surprisingly profitable. Earlier this year, a U.S. Senate study found that American households pay an average of $231 a year to rent cable boxes. Further, the report estimated that 99 percent of cable customers rented their equipment and that, across the country, those rentals added up to a $19.5 billion industry.
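Those two figures fit together, assuming the study’s numbers are annual, nationwide totals. A back-of-the-envelope check:

$19,500,000,000 ÷ $231 per household per year ≈ 84 million households

That works out to roughly 84 million renting households, consistent with the near-universal rental rate the report describes.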
The senators who commissioned the study, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, noted that this dependable rental revenue gave the industry little incentive to innovate and make better cable boxes. Which raises a really good question: Why aren’t more people purchasing their cable boxes?
Though it wasn’t pretty, Minaj was really teaching a lesson in civility.
Nicki Minaj didn’t, in the end, say much to Miley Cyrus at all. If you only read the comments that lit up the Internet at last night’s MTV Video Music Awards, you might think she was kidding, or got cut off, when she “called out” the former Disney star who was hosting: “And now, back to this bitch that had a lot to say about me the other day in the press. Miley, what’s good?”
To summarize: When Minaj’s “Anaconda” won the award for Best Hip-Hop Video, she took to the stage in a slow shuffle, shook her booty with presenter Rebel Wilson, and then gave an acceptance speech in which she switched vocal personas as amusingly as she does in her best raps—street-preacher-like when telling women “don’t you be out here depending on these little snotty-nosed boys”; sweetness and light when thanking her fans and pastor. Then a wave of nausea seemed to come over her, and she turned her gaze toward Cyrus. To me, the look on her face, not the words that she said, was the news of the night:
Learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic.
I’m not in favor of anyone learning to code unless she really wants to. I believe you should follow your bliss, career-wise, because most of the things you’d buy with all the money you’d make as a programmer won’t make you happy. Also, if your only reason for learning to code is that you want to be a journalist and you think it’s the only way to break into the field, know that it isn’t.
I’m all for people not becoming coders, in other words—as long as they make that decision for the right reasons. “I’m bad at math” is not the right reason.
Math has very little to do with coding, especially at the early stages. In fact, I’m not even sure why people conflate the two. (Maybe it has to do with the fact that both fields are male-dominated.)
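To make that concrete, here is a minimal sketch of the sort of exercise a beginner might tackle in a first week of programming (the word-count task is a hypothetical example, not one drawn from any particular curriculum). The hard parts are the logic and the trial-and-error; the only arithmetic is adding 1 to a counter:

```python
# A classic beginner exercise: count how often each word appears
# in a sentence. The logic is loops and lookups; the only "math"
# is incrementing a counter by 1.
def word_counts(text):
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

print(word_counts("the quick brown fox jumps over the lazy dog the end"))
# {'the': 3, 'quick': 1, 'brown': 1, 'fox': 1, 'jumps': 1,
#  'over': 1, 'lazy': 1, 'dog': 1, 'end': 1}
```

Getting even a toy like this working calls on exactly the skills the job actually demands, reading error messages, looking things up, and iterating. None of it is calculus.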
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well prepared the nation is for the next catastrophe, whether that be a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean that the average citizen or most levels of government are anywhere near ready for them.
Actually, a good amount: Belittling their plight by comparing it to blue-collar workers’ ignores the trickle-down harms of an exhausting work culture.
Over the past few decades, workers without college degrees have not only seen jobs disappear and wages stagnate—the jobs that remain have, all too often, gotten worse. Constant surveillance is common; schedules are erratic; escalating performance quotas demand ever-faster work. But these trends, often thought to be confined to front-line workers, have crept up corporate hierarchies, affecting managers and executives. That’s prompted a new controversy: Are white-collar workers victims of exploitation, or merely whining?
A devastating report on the work culture at Amazon’s headquarters recently reignited the debate. The New York Times’s August exposé, based on dozens of interviews, portrayed a firm with all the regimentation and rigidity of military boot camp, minus the esprit de corps. Workers routinely cried at their desks. Rather than being comforted or accommodated, sick employees were dumped into Orwellian “Performance Improvement Plans” that simply hastened their eventual departures. Faced with a comprehensive employee-ranking system, cabals of managers agreed to praise one another while talking down the performance of others. Amazon’s “collaborative feedback tool” encouraged a Panopticon of vicious feedback—and similar software may be coming to many more firms.