Even if toppling Qaddafi made sense on its own terms, the Western campaign will make it far harder to do any good for Syria.
The intervention in Libya -- often touted by advocates as a sterling example of how to intervene responsibly in a civil conflict to prevent atrocity -- has largely fallen off the world's radar. Libya is often cited as a case supporting a possible intervention to prevent further atrocity in Syria, but the two are very different, and the comparison ignores what's happened in Libya since Qaddafi's fall.
The intervention in Libya is far from an assured success. Last week fighting broke out in Tripoli between rival militias bickering over a stretch of beach. Some of the many guns Qaddafi mustered to defend his regime have now found their way to Tuareg rebels in Mali, who are busy fomenting another insurgency there. And last month Médecins Sans Frontières withdrew some of its staff after witnessing acts of torture by some of the revolutionaries that the West had supported in ousting the old regime.
Intervention, in other words, has lots of consequences, and often they're quite bad. The problems the intervention has unleashed on Libya itself get at least some attention; the broader political consequences of the Libyan campaign go even less remarked upon. Russia and China, in particular, have openly said they're angry over how the intervention played out, and it should be no surprise to see them block future moves for intervention.
A big reason for Russia and China's intransigence is the NATO coalition that led the intervention, which badly overstepped the range of permissible actions stipulated in the UN Security Council Resolution that authorized intervention. Russia was an early critic of such actions as France's weapons shipments to the rebels -- criticism that could have been accounted for (Moscow never made any secret of its concerns) but which seemed to be ignored in the rush to intervene. President Obama made a rapid transition from saying "regime change is not on the table" last March (part of the bargain to get Russian abstention from the UNSC vote) to publicly calling for Qaddafi's ouster. France and the UK used similar language, ignoring the politics of getting UN approval for intervention.
Now, when there is another escalating crisis in Syria -- Bashar al-Assad's unjustifiable mass-murder of protesters -- Russia and China have stepped in to veto further UNSC action. This was an entirely predictable response, as both Russia and China were openly scornful of the misleading statements made by interventionists in NATO and the Arab League to get support for Libya.
The veto has led some analysts to say the UNSC is losing relevance, but it seems to me that the opposite might actually be true: the politics of the UNSC should matter as much for launching an intervention as the merits of actually attacking the target country. There is no doubt that what is unfolding in Syria is an atrocity that must end. Sadly, the Libya intervention itself, while a precedent for the idea of global action against a humanitarian threat, is also a very real reason that the world will have a tougher time doing anything for Syria.
Walter Russell Mead wrote an excellent exegesis of the entirety of Russia's calculations on the veto, taking special note of Russian domestic politics and Moscow's obsession with its own diminishment in international bodies like the UN. Put simply: Russia expected some consideration in the Libyan campaign, but instead the relevant players are actively working against Russian interests there, even post-Qaddafi. Moscow could not risk the same thing happening to its many interests in Syria.
Even if it were not an election year in Russia, where Putin has just been reminded that he does not enjoy uncritical love from his people, it's likely Russia would have vetoed Syria because of Libya. But there are additional, bigger politics to consider as well.
Many states, none of whom are free, worry that the West's renewed love of intervention might one day be focused upon them. This is a critical consequence of rejecting sovereignty and declaring governments unfit to rule through a mixture of expediency and opportunity. Powerful states with poor human rights records -- Russia and China included -- look at what happened in Libya and see disaster, not freedom. And they are taking steps to avoid it.
In a broader sense, too, the renewed focus on intervention, especially considering what happened in Libya, could have pernicious consequences. Qaddafi famously gave up his nuclear weapons program in 2003. That he was later overthrown right after the U.S. re-established diplomatic ties with Tripoli isn't broadly seen as a victory for diplomacy and denuclearization, but rather as a textbook case of why nuclear weapons are fantastic invasion insurance. That may be one reason (among many others) why Iran seems so unwilling to contemplate abandoning its own nuclear weapons program -- it believes that nuclear weapons will prevent a capricious and unpredictable West from invading or intervening in its internal affairs.
In a vacuum, intervening to prevent mass killings in Libya made sense. Libya, however, did not (and does not) exist in a vacuum. It has both internal and regional politics. So does Syria. The failure to gain international buy-in to do something -- not necessarily militarily but some response -- to the atrocities there is a direct consequence of interventionists ignoring politics in their rush to do good. Unfortunately, the people of Syria are now paying the price, and will continue to do so.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
A Brooklyn-based group is arguing that the displacement of longtime residents meets a definition conceived by the United Nations in the aftermath of World War II.
No one will be surprised to learn that the campaign to build a national movement against gentrification is being waged out of an office in Brooklyn, New York.
For years, the borough’s name has been virtually synonymous with gentrification, and on no street in Brooklyn are its effects more evident than on Atlantic Avenue, where, earlier this summer, a local bodega protesting its impending departure in the face of a rent hike, put up sarcastic window signs advertising “Bushwick baked vegan cat food” and “artisanal roach bombs.”
Just down the block from that bodega are the headquarters of Right to the City, a national alliance of community-based organizations that since 2007 has made it its mission to fight “gentrification and the displacement of low-income people of color.” For too long, organizers with the alliance say, people who otherwise profess concern for the poor have tended to view gentrification as a mere annoyance, as though its harmful effects extended no further than the hassles of putting up with pretentious baristas and overpriced lattes. Changing this perception is the first order of business for Right to the City: Gentrification, as these organizers see it, is a human-rights violation.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
But letting customers buy their own would force cable companies to improve their equipment.
One of the least glamorous realities of the American cable industry is a relic invented in 1948: the cable box. The box has become a fixture in the American household, not least because it is surprisingly profitable. Earlier this year, a U.S. Senate study found that American households pay an average of $231 a year to rent cable boxes. Further, the report estimated that 99 percent of cable customers rented their equipment, and, across the country, that added up to a $19.5 billion business in cable-box rentals alone.
The senators who commissioned the study, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, noted that this dependable rental revenue gave the industry little incentive to innovate and make better cable boxes. Which raises a really good question: Why aren't more people purchasing their cable boxes?
Why haven’t more challengers entered the race to defeat the Iraq War hawk, Patriot Act supporter, and close friend of big finance?
As Hillary Clinton loses ground to Bernie Sanders in Iowa, where her lead shrinks by the day, it’s worth noticing that she has never made particular sense as the Democratic Party’s nominee. She may be more electable than her social-democratic rival from Vermont, but plenty of Democrats are better positioned to represent the center-left coalition. Why have they let the former secretary of state keep them out of the race? If Clinton makes it to the general election, I understand why most Democrats will support her. She shares their views on issues as varied as preserving Obamacare, abortion rights, extending legal status to undocumented workers, strengthening labor unions, and imposing a carbon tax to slow climate change.
Learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic.
I’m not in favor of anyone learning to code unless she really wants to. I believe you should follow your bliss, career-wise, because most of the things you’d buy with all the money you’d make as a programmer won’t make you happy. Also, if your only reason for learning to code is that you want to be a journalist and you think it’s the only way to break into the field, that belief is false.
I’m all for people not becoming coders, in other words—as long as they make that decision for the right reasons. “I’m bad at math” is not the right reason.
Math has very little to do with coding, especially at the early stages. In fact, I’m not even sure why people conflate the two. (Maybe it has to do with the fact that both fields are male-dominated.)
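To illustrate the point, consider the kind of exercise a beginner actually writes. This is a hypothetical sketch (the function name and numbers are invented for illustration), and the only math in it is multiplication, division, and addition:

```python
# A typical beginner exercise: split a restaurant bill with tip.
# Nothing here requires more than fourth-grade arithmetic.

def split_bill(total, tip_percent, diners):
    tip = total * tip_percent / 100      # e.g. 20% of $80 is $16
    each = (total + tip) / diners        # divide the grand total evenly
    return round(each, 2)                # round to cents

print(split_bill(80.00, 20, 4))  # each of 4 diners pays 24.0
```

The hard parts of an exercise like this are naming things, breaking the problem into steps, and finding the bug when the output looks wrong—logic and trial-and-error, not mathematics.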
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold on people, even 10 years later. But it also raises uncomfortable questions about how well the nation is prepared for the next catastrophe, whether that be a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean that the average citizen or most levels of the government are anywhere near ready for them.
Actually, a good amount: Belittling their plight by comparing it to blue-collar workers’ ignores the trickle-down harms of an exhausting work culture.
Over the past few decades, workers without college degrees have not only seen jobs disappear and wages stagnate—the jobs that remain have, all too often, gotten worse. Constant surveillance is common; schedules are erratic; escalating performance quotas exact faster work. But these trends, often thought to be confined to front-line workers, have crept up corporate hierarchies, affecting managers and executives. That’s prompted a new controversy: Are white-collar workers victims of exploitation, or merely whining?
A devastating report on the work culture at Amazon’s headquarters recently reignited the debate. The New York Times’s August exposé, based on dozens of interviews, portrayed a firm with all the regimentation and rigidity of military boot camp, minus the esprit de corps. Workers routinely cried at their desks. Rather than being comforted or accommodated, sick employees were dumped into Orwellianly named “Performance Improvement Plans” that simply hastened their eventual departures. Faced with a comprehensive employee-ranking system, cabals of managers agreed to praise one another while talking down the performance of others. Amazon’s “collaborative feedback tool” encouraged a Panopticon of vicious feedback—and similar software may be coming to many more firms.
Conservatives want to defund the group, even if it means shutting down the government. And they’re holding the GOP leadership accountable.
It has become an annual harbinger of autumn in this era of divided government: The calendar swings from August to September, Congress returns from its long summer break, and Republican leaders try to figure out how to keep the federal lights on past the end of the month.
In 2013, John Boehner gave in to Senator Ted Cruz and his conservative allies in the House, and the government shut down for two weeks in a failed fight over Obamacare. A year ago, Boehner and Mitch McConnell succeeded in twice putting off a losing battle over immigration until after they could wrest control of the Senate from the Democrats.
With federal funding set to expire on September 30, conservatives are once again demanding a standoff that Boehner and McConnell are hell-bent on avoiding. This time around, the issue that might prevent an orderly—if temporary—extension of funding is Planned Parenthood. Along with Cruz, House conservatives insist that any spending bill sent to President Obama’s desk explicitly prohibit taxpayer dollars from going to the women’s health organization, which has come under fire over undercover videos that purportedly show its officials discussing the sale of fetal tissue. Democrats have rallied around Planned Parenthood, and an effort to ax its approximately $500 million in annual funding is likely to fall short, blocked either by a filibuster in the Senate or by a presidential veto.
The NBC show isn’t casting its net wide enough when it comes to finding new players.
Since the departure of many of its biggest stars two years ago, Saturday Night Live has mostly avoided major cast changes. Yesterday, NBC announced the show would add only one new cast member for its 41st season—the near-unknown stand-up comic Jon Rudnitsky. SNL is, of course, a sketch-comedy show, but it keeps hiring mostly white stand-ups who have a markedly different skill set, with limited results. As critics and viewers keep calling for greater diversity on the show, it’s hard to understand the series’s reasoning in sticking to old habits.
As is unfortunately typical today, controversy has already arisen over some tasteless old jokes from Rudnitsky’s Twitter and Vine feeds, similar to the furor that greeted Trevor Noah’s hiring at The Daily Show this summer. But Rudnitsky was apparently hired on the strength of his stand-up performances, not his Internet presence, much like the other young stand-ups the show has hired in recent years: Pete Davidson, Brooks Wheelan (since fired), and Michael Che. It’s a peculiar route to the show, because SNL is 90 percent sketch acting, and unless you’re hosting Weekend Update (like Che), you’re not going to do a lot of stand-up material. So why hire Rudnitsky?