Even if toppling Qaddafi made sense on its own terms, the Western campaign will make it far harder to do any good for Syria.
Hillary Clinton speaks to reporters at the United Nations during a Security Council meeting on Syria / AP
The intervention in Libya -- often touted by advocates as a sterling example of how to intervene responsibly in a civil conflict to prevent atrocity -- has largely fallen off the world's radar. Libya is often cited as a case supporting a possible intervention to prevent further atrocity in Syria, but the two are very different, and the comparison ignores what's happened in Libya since Qaddafi's fall.
The intervention in Libya is far from an assured success. Last week fighting broke out in Tripoli between rival militias bickering over a stretch of beach. Some of the many guns Qaddafi mustered to defend his regime have now found their way to Tuareg rebels in Mali, who are busy fomenting another insurgency there. And last month Médecins Sans Frontières withdrew some of its staff after witnessing acts of torture by some of the revolutionaries that the West had supported in ousting the old regime.
Intervention, in other words, has lots of consequences, and often they're quite bad. The problems the intervention has unleashed within Libya itself are relatively easy to talk about; less remarked upon are the broader political consequences of the Libyan campaign. Russia and China, in particular, have openly said they're angry over how the intervention played out, and it should be no surprise to see them block future moves for intervention.
A big reason for Russia and China's intransigence is the NATO coalition that led the intervention, which badly overstepped the range of permissible actions stipulated in the UN Security Council Resolution that authorized intervention. Russia was an early critic of such actions as France's weapons shipments to the rebels -- criticism that could have been accounted for (Moscow never made any secret of its concerns) but which seemed to be ignored in the rush to intervene. President Obama made a rapid transition from saying "regime change is not on the table" last March (part of the bargain to get Russian abstention from the UNSC vote) to publicly calling for Qaddafi's ouster. France and the UK used similar language, ignoring the politics of getting UN approval for intervention.
Now, when there is another escalating crisis in Syria -- Bashar al-Assad's unjustifiable mass murder of protesters -- Russia and China have stepped in to veto further UNSC action. This was an entirely predictable response, as both Russia and China were openly scornful of the misleading statements made by interventionists in NATO and the Arab League to get support for Libya.
The veto has led some analysts to say the UNSC is losing relevance, but it seems to me that the opposite might actually be true: the politics of the UNSC should matter as much for launching an intervention as the merits of actually attacking the target country. There is no doubt that what is unfolding in Syria is an atrocity that must end. Sadly, the Libya intervention itself, while a precedent for the idea of global action against a humanitarian threat, is also a very real reason that the world will have a tougher time doing anything for Syria.
Walter Russell Mead wrote an excellent exegesis of the entirety of Russia's calculations on the veto, taking special note of Russian domestic politics and Moscow's obsession with its own diminishment in international bodies like the UN. Put simply: Russia expected some consideration in the Libyan campaign, but instead the relevant players are actively working against Russian interests there, even post-Qaddafi. Moscow could not risk the same thing happening to its many interests in Syria.
Even if it were not an election year in Russia, where Putin has just been reminded that he does not enjoy uncritical love from his people, it's likely Russia would have vetoed Syria because of Libya. But there are additional, bigger politics to consider as well.
Many states, none of them free, worry that the West's renewed love of intervention might one day be focused upon them. This is a critical consequence of rejecting sovereignty and declaring governments unfit to rule through a mixture of expediency and opportunity. Powerful states with poor human rights records -- Russia and China included -- look at what happened in Libya and see disaster, not freedom. And they are taking steps to avoid it.
In a broader sense, too, the renewed focus on intervention, especially considering what happened in Libya, could have pernicious consequences. Qaddafi famously gave up his nuclear weapons program in 2003. That he was later overthrown right after the U.S. re-established diplomatic ties with Tripoli isn't broadly seen as a victory for diplomacy and denuclearization, but rather a textbook case of why nuclear weapons are fantastic invasion insurance. That may be one reason (among many others) why Iran seems so unwilling to contemplate abandoning its own nuclear weapons program -- it believes that nuclear weapons will prevent a capricious and unpredictable West from invading or intervening in its internal affairs.
In a vacuum, intervening to prevent mass killings in Libya made sense. Libya, however, did not (and does not) exist in a vacuum. It has both internal and regional politics. So does Syria. The failure to gain international buy-in for some response to the atrocities there -- not necessarily a military one -- is a direct consequence of interventionists ignoring politics in their rush to do good. Unfortunately, the people of Syria are now paying the price, and will continue to do so.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
The results of the referendum are, in theory, not legally binding.
Lest we think the Euroskepticism displayed this week by British voters is new, let me present a scene from the BBC’s Yes, Minister, a comedy about the U.K. civil service’s relationship with a minister. The series ran from 1980 to ’84 (and, yes, it was funny), at a time when the European Union was a mere glint in its founders’ eyes.
The Europe being referred to in the scene is the European Economic Community (EEC), an eventually 12-member bloc established in the mid-1950s to bring about greater economic integration among its members.
In many ways, the seeds of the U.K.’s Thursday referendum on its membership in the European Union were sown soon after the country joined the now-defunct EEC in 1973. Then, as now, the ruling Conservative Party and opposition Labour, along with the rest of the country, were deeply divided over the issue. In the run-up to the general election the following year, Labour promised in its manifesto to put the U.K.’s EEC membership to a public referendum. Labour eventually came to power and Parliament passed the Referendum Act in 1975, fulfilling that campaign promise. The vote was held on June 5, 1975, and the result was what the political establishment had hoped for: an overwhelming 67 percent of voters supported the country’s EEC membership.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
More than a decade ago, Daniel Suelo closed his bank account and moved into a desert cave. Here's how he eats, sleeps, and evades the law.
"Our whole society is designed so that you have to have money," Daniel Suelo says. "You have to be a part of the capitalist system. It's illegal to live outside of it."

Suelo has defied these laws. His primary residence is the canyons near Arches National Park, where he has lived in a dozen caves tucked into sandstone nooks. In the fall of 2002, two years after quitting money, he homesteaded a majestic alcove high on a cliff, two hundred feet across and fifty feet tall. Sitting inside and gazing into the gorge below felt like heralding himself to the world from inside the bell of a trumpet.
Patrick Griffin, his chief congressional affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, in part because it expired in 2004 and in part because the provision was embedded in the larger crime bill.
The U.K.’s vote to leave the European Union betrays a failure of empathy and imagination among its leaders. Will America’s political establishment fare any better?
If there is a regnant consensus among the men and women who steer the Western world, it is this: The globe is flattening. Borders are crumbling. Identities are fluid. Commerce and communications form the warp and woof, weaving nations into the tight fabric of a global economy. People are free to pursue opportunity, enriching their new homes culturally and economically. There may be painful dislocations along the way, but the benefits of globalization heavily outweigh its costs. And those who cannot see this, those who would resist it, those who would undo it—they are ignorant of their own interests, bigoted, xenophobic, and backward.
So entrenched is this consensus that, for decades, in most Western democracies, few mainstream political parties have thought to challenge it. They have left it to the politicians on the margins of the left and the right to give voice to such sentiments—and voicing such sentiments relegated politicians to the margins of political life.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.