The Soviets might have landed on two planets to America's one, but the extent of the ultimate U.S. space victory is a sort of metaphor for the Cold War and its resolution.
The Venera 7, left, landed on Venus in 1970. The Venera 13 took this image of the planet in 1982. (Wikimedia)
In the end, when the nuclear warheads were taken off alert and the borders of Europe and Asia redrawn, history recorded the Cold War as a great American victory. It won the arms race and it won Europe; its economic and political models both triumphed; and it won the war of ideology, with democracy displacing communism and totalitarianism across most of the globe. But there's one arena where the Cold War looked a bit closer to a tie: space.
The Soviet Union was the first to put a satellite in space, the first to put a person in space, the first to land a spacecraft on the moon, and the first -- and only -- to land on Venus. The U.S. was the first to put a person on the moon, the first to do flybys of Mars, Venus, and Jupiter, and the first -- but not only -- to land on Mars, most recently with today's Curiosity. (The European Space Agency later got into the game by landing a probe on Titan, a moon orbiting Saturn, in 2005 with assistance from a U.S. spacecraft.) I don't know whether or how you can declare a winner from those two records, but one thing is clear: 20 years after the collapse of the Soviet Union and all it stood for, the U.S. has not matched the Soviet record for number of planetary surfaces visited.
Of course, space exploration isn't about beating the Soviets anymore, so the U.S. would have little to gain by visiting another planet just to say we did. And, when it comes to actual scientific knowledge gained and height of technological achievement, the Soviet edge is as broken and gone as the Berlin Wall. Still, this old, unchanged record is a reminder of the Soviet Union's deep mark on history, and that it wasn't so long ago that space, an area of global American leadership today, was closely contested, another front in the all-consuming Cold War.
The first manmade object ever to soft-land on another planet was the Soviet-made Venera 7. It launched on August 17, 1970, just over a year after Neil Armstrong walked on the moon, and entered the Venusian atmosphere on December 15. Soviet mission control received 23 minutes of faint signals, the first data beamed from the surface of another world. In 1975, the Soviet Union landed the more successful Venera 9 and Venera 10, which sent back the first photos. The Venera program returned soil analyses and color, panoramic views in 1981 and again in 1985. The U.S. never attempted to land on Venus, but it has sent orbiters, including 1978's Pioneer Venus 1, whose companion mission dropped probes into the atmosphere.
The Soviet Union might have won the race to Venus, but Mars was more contested. In May 1971, as a proxy war in Vietnam raged, the U.S. and Soviet Union hurled five spacecraft toward the red planet. Mariner 8 and Kosmos 419 fizzled, but on November 13 the American Mariner 9 became the first vessel to enter orbit around another planet. Two weeks later, the Soviet Union's Mars 2 followed into orbit, with Mars 3 a few days behind. The U.S. orbiter took over 100 times as many photos as the two Soviet craft, but Mars 2 and Mars 3 both carried landers. The first crashed; the second achieved the first-ever landing on Mars. But it transmitted for only 20 seconds before its instruments shut down, possibly because of a dust storm.
Both the U.S. and Soviet Union attempted a number of Mars landers after that, but the Americans had far more success. In 1974, the Soviet Union suffered another disappointment with Mars 6, which landed but sent back garbled data because of a faulty computer chip, and Mars 7, which simply missed the planet. The U.S. landed the two Viking craft in 1976, and later upgraded to rovers with the 1997 Sojourner, the 2004 Spirit and Opportunity, and the 2012 Curiosity. A Soviet vessel never again touched down successfully, despite two 1988 attempts.
In a way, the planetary race can be seen as a metaphor for the Cold War itself. The competition might have been nail-bitingly close at the time, with the Soviet Union taking some historic leaps ahead of the Americans, a few of which still stand. In the end, though, not only did the U.S. win, but the extent of its victory has surely surpassed even the wildest dreams of either Nixon or Khrushchev.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
A Brooklyn-based group is arguing that the displacement of longtime residents meets a definition conceived by the United Nations in the aftermath of World War II.
No one will be surprised to learn that the campaign to build a national movement against gentrification is being waged out of an office in Brooklyn, New York.
For years, the borough’s name has been virtually synonymous with gentrification, and on no street in Brooklyn are its effects more evident than on Atlantic Avenue, where, earlier this summer, a local bodega, protesting its impending departure in the face of a rent hike, put up sarcastic window signs advertising “Bushwick baked vegan cat food” and “artisanal roach bombs.”
Just down the block from that bodega are the headquarters of Right to the City, a national alliance of community-based organizations that since 2007 has made it its mission to fight “gentrification and the displacement of low-income people of color.” For too long, organizers with the alliance say, people who otherwise profess concern for the poor have tended to view gentrification as a mere annoyance, as though its harmful effects extended no further than the hassles of putting up with pretentious baristas and overpriced lattes. Changing this perception is the first order of business for Right to the City: Gentrification, as these organizers see it, is a human-rights violation.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
But letting customers buy their own would force cable companies to improve their equipment.
One of the least glamorous realities of the American cable industry is a relic invented in 1948: the cable box. The box has become a fixture in the American household, not least because it is surprisingly profitable. Earlier this year, a U.S. Senate study found that American households pay an average of $231 a year to rent their cable boxes. The report also estimated that 99 percent of cable customers rented their equipment, and that, across the country, those rentals added up to a $19.5 billion industry.
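The study's figures are easy to sanity-check with back-of-the-envelope arithmetic; a quick, illustrative calculation (using only the numbers reported above) shows how many renting households they imply:

```python
# Back-of-the-envelope check of the Senate study's figures (illustrative only).
avg_annual_rental = 231    # dollars per household per year, per the study
total_market = 19.5e9      # dollars per year across the country

# Dividing the total market by the per-household cost gives the implied
# number of households renting a cable box.
implied_households = total_market / avg_annual_rental
print(round(implied_households / 1e6, 1))  # roughly 84.4 million households
```

That figure is in the right neighborhood for U.S. pay-TV subscribership, which is why the study's 99-percent rental estimate is plausible.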
The senators who commissioned the study, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, noted that this dependable rental revenue gave the industry little incentive to innovate and make better cable boxes. Which raises a really good question: Why aren’t more people purchasing their cable boxes?
Why haven’t more challengers entered the race to defeat the Iraq War hawk, Patriot Act supporter, and close friend of big finance?
As Hillary Clinton loses ground to Bernie Sanders in Iowa, where her lead shrinks by the day, it’s worth noting that she has never made particular sense as the Democratic Party’s nominee. She may be more electable than her social-democratic rival from Vermont, but plenty of Democrats are better positioned to represent the center-left coalition. Why have they let the former secretary of state keep them out of the race? If Clinton makes it to the general election, I understand why most Democrats will support her. She shares their views on issues as varied as preserving Obamacare, abortion rights, extending legal status to undocumented workers, strengthening labor unions, and imposing a carbon tax to slow climate change.
Learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic.
I’m not in favor of anyone learning to code unless she really wants to. I believe you should follow your bliss, career-wise, because most of the things you’d buy with all the money you’d make as a programmer won’t make you happy. Also, if your only reason for learning to code is that you want to be a journalist and think it’s the only way to break into the field, that premise is false.
I’m all for people not becoming coders, in other words—as long they make that decision for the right reasons. “I’m bad at math” is not the right reason.
Math has very little to do with coding, especially at the early stages. In fact, I’m not even sure why people conflate the two. (Maybe it has to do with the fact that both fields are male-dominated.)
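To make the point concrete, here is the kind of task a beginning programmer actually spends time on; this sketch (my own illustration, not from any curriculum) uses nothing beyond counting and adding one:

```python
# Illustrative beginner exercise: tally how often each word appears in a text.
# The only "math" involved is adding 1 -- the real work is logic and patience.
def word_counts(text):
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1  # fourth-grade arithmetic
    return counts

print(word_counts("The cat the hat"))
```

The hard parts of writing even this much are knowing that `split` exists, deciding to lowercase the input, and debugging the inevitable typo, none of which calls for anything resembling calculus.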
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well the nation is prepared for the next catastrophe, whether that’s a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean that the average citizen, or most levels of government, is anywhere near ready for them.
Actually, a good amount: Belittling their plight by comparing it to blue-collar workers’ ignores the trickle-down harms of an exhausting work culture.
Over the past few decades, workers without college degrees have not only seen jobs disappear and wages stagnate—the jobs that remain have, all too often, gotten worse. Constant surveillance is common; schedules are erratic; escalating performance quotas exact faster work. But these trends, often thought to be confined to front-line workers, have crept up corporate hierarchies, affecting managers and executives. That’s prompted a new controversy: Are white-collar workers victims of exploitation, or merely whining?
A devastating report on the work culture at Amazon’s headquarters recently reignited the debate. The New York Times’s August exposé, based on dozens of interviews, portrayed a firm with all the regimentation and rigidity of military boot camp, minus the esprit de corps. Workers routinely cried at their desks. Rather than being comforted or accommodated, sick employees were dumped into Orwellian-sounding “Performance Improvement Plans” that simply hastened their eventual departures. Faced with a comprehensive employee-ranking system, cabals of managers agreed to praise one another while talking down the performance of others. Amazon’s “collaborative feedback tool” encouraged a Panopticon of vicious feedback—and similar software may be coming to many more firms.
Conservatives want to defund the group, even if it means shutting down the government. And they’re holding the GOP leadership accountable.
It has become an annual harbinger of autumn in this era of divided government: The calendar swings from August to September, Congress returns from its long summer break, and Republican leaders try to figure out how to keep the federal lights on past the end of the month.
In 2013, John Boehner gave in to Senator Ted Cruz and his conservative allies in the House, and the government shut down for two weeks in a failed fight over Obamacare. A year ago, Boehner and Mitch McConnell succeeded in twice putting off a losing battle over immigration until after they could wrest control of the Senate from the Democrats.
With federal funding set to expire on September 30, conservatives are once again demanding a standoff that Boehner and McConnell are hell-bent on avoiding. This time around, the issue that might prevent an orderly—if temporary—extension of funding is Planned Parenthood. Along with Cruz, House conservatives insist that any spending bill sent to President Obama’s desk explicitly prohibit taxpayer dollars from going to the women’s health organization, which has come under fire over undercover videos that purportedly show its officials discussing the sale of fetal tissue. Democrats have rallied around Planned Parenthood, and an effort to ax its approximately $500 million in annual funding is likely to fall short, either by running into a filibuster in the Senate or a presidential veto.
The NBC show isn’t casting its net wide enough when it comes to finding new players.
Since the departure of many of its biggest stars two years ago, Saturday Night Live has mostly avoided major cast changes. Yesterday, NBC announced the show would add only one new cast member for its 41st season—the near-unknown stand-up comic Jon Rudnitsky. SNL is, of course, a sketch-comedy show, but it keeps hiring mostly white stand-ups who have a markedly different skill set, with limited results. As critics and viewers keep calling for greater diversity on the show, it’s hard to understand the series’s reasoning in sticking to old habits.
As is unfortunately typical today, controversy has already arisen over some tasteless old jokes from Rudnitsky’s Twitter and Vine feeds, similar to the furor that greeted Trevor Noah’s hiring at The Daily Show this summer. But Rudnitsky was apparently hired on the strength of his stand-up performances, not his Internet presence, like the other young stand-ups the show has hired in recent years: Pete Davidson, Brooks Wheelan (since fired), and Michael Che. It’s a peculiar route to the show, because SNL is 90 percent sketch acting, and unless you’re hosting Weekend Update (like Che), you won’t get to do much stand-up material. So why hire Rudnitsky?