The Soviets might have landed on two planets to America's one, but the extent of the ultimate U.S. space victory is a sort of metaphor for the Cold War and its resolution.
The Venera 7, left, landed on Venus in 1970. The Venera 13 took this image of the planet in 1982. (Wikimedia)
In the end, when the nuclear warheads were taken off alert and the borders of Europe and Asia were redrawn, history recorded the Cold War as a great American victory. The U.S. won the arms race and it won Europe; its economic and political models both triumphed; and it won the war of ideology, with democracy displacing communism and totalitarianism across most of the globe. But there's one arena where the Cold War looked a bit closer to a tie: space.
The Soviet Union was the first to put a satellite in space, the first to put a person in space, the first to land a spacecraft on the moon, and the first -- and only -- nation to land on Venus. The U.S. was the first to put a person on the moon, the first to do flybys of Mars, Venus, and Jupiter, and the first -- but not only -- to successfully land on Mars, most recently with today's Curiosity. (The European Space Agency later got into the game by landing a probe on Titan, a moon of Saturn, in 2005, with assistance from a U.S. spacecraft.) I don't know whether or how you can declare a winner from those two records, but one thing is clear: 20 years after the collapse of the Soviet Union and all it stood for, the U.S. has still not matched the Soviet record for the number of planetary surfaces visited.
Of course, space exploration isn't about beating the Soviets anymore, so the U.S. would have little to gain by visiting another planet just to say we did. And, when it comes to actual scientific knowledge gained and height of technological achievement, the Soviet edge is as broken and gone as the Berlin Wall. Still, this old, unchanged record is a reminder of the Soviet Union's deep mark on history, and that it wasn't so long ago that space, an area of global American leadership today, was closely contested, another front in the all-consuming Cold War.
The first manmade object ever to soft-land on another planet was the Soviet-made Venera 7. It launched on August 17, 1970, just over a year after Neil Armstrong walked on the moon, was boosted from Earth orbit toward Venus, and entered the Venusian atmosphere on December 15. Soviet mission control received 23 minutes of faint signals, the first data beamed from the surface of another world. In 1975, the Soviets landed the more successful Venera 9 and Venera 10, which sent back the first photos. The Venera program went on to analyze soil on the surface and send back color panoramic views in 1982, with the last Soviet landings following in 1985. The U.S. never attempted to land on Venus, but it has sent orbiters, and the 1978 Pioneer Venus Multiprobe dropped four probes into the planet's atmosphere.
The Soviet Union might have won the race to Venus, but Mars was more contested. In May 1971, as a proxy war raged in Vietnam, the U.S. and the Soviet Union hurled five spacecraft toward the red planet. Mariner 8 and Kosmos 419 fizzled, but on November 13 the American Mariner 9 became the first vessel to enter orbit around another planet. Two weeks later, the Soviet Union's Mars 2 followed into orbit, with Mars 3 a few days behind. The American orbiter took over 100 times as many photos as the two Soviet ships, but Mars 2 and Mars 3 both carried landers. The first crashed; the second achieved the first-ever soft landing on Mars, but its transmission lasted only about 20 seconds before its instruments shut down, possibly because of a dust storm.
Both the U.S. and the Soviet Union attempted a number of Mars landers after that, but the Americans had far more success. In 1974, the Soviets suffered another disappointment with Mars 6, which transmitted data during its descent but lost contact at touchdown, its readings garbled by a faulty computer chip, and with Mars 7, whose lander simply missed the planet. The U.S. landed its two Viking craft in 1976, and later upgraded to rovers with the 1997 Sojourner, the 2004 Spirit and Opportunity, and the 2012 Curiosity. A Soviet vessel never again successfully touched down, despite two 1988 attempts.
In a way, the planetary race can be seen as a metaphor for the Cold War itself. The competition might have been nail-bitingly close at the time, with the Soviet Union taking some historic leaps ahead of the Americans, a few of which are still with us. In the end, though, not only did the U.S. win, but the extent of its victory has surely surpassed even the wildest dreams of either Nixon or Khrushchev.