Even if toppling Qaddafi made sense on its own terms, the Western campaign will make it far harder to do any good for Syria.
Hillary Clinton speaks to reporters at the United Nations during a Security Council meeting on Syria / AP
The intervention in Libya -- often touted by advocates as a sterling example of how to intervene responsibly in a civil conflict to prevent atrocity -- has largely fallen off the world's radar. Libya is often cited as a case supporting a possible intervention to prevent further atrocity in Syria, but the two are very different, and the comparison ignores what's happened in Libya since Qaddafi's fall.
The intervention in Libya is far from an assured success. Last week, fighting broke out in Tripoli between rival militias bickering over a stretch of beach. Some of the many guns Qaddafi mustered to defend his regime have now found their way to Tuareg rebels in Mali, who are busy fomenting another insurgency there. And last month, Médecins Sans Frontières withdrew some of its staff after witnessing acts of torture by some of the revolutionaries the West had supported in ousting the old regime.
Intervention, in other words, has many consequences, and often they're quite bad. The problems the intervention has unleashed on Libya itself are relatively easy to see; far less remarked upon are the broader political consequences of the Libyan campaign. Russia and China, in particular, have openly said they're angry over how the intervention played out, and it should be no surprise to see them block future moves toward intervention.
A big reason for Russia and China's intransigence is that the NATO coalition leading the intervention badly overstepped the range of permissible actions stipulated in the UN Security Council resolution that authorized it. Russia was an early critic of such actions as France's weapons shipments to the rebels -- criticism that could have been accounted for (Moscow never made any secret of its concerns) but that seemed to be ignored in the rush to intervene. President Obama made a rapid transition from saying "regime change is not on the table" last March (part of the bargain to get Russian abstention from the UNSC vote) to publicly calling for Qaddafi's ouster. France and the UK used similar language, ignoring the politics of getting UN approval for intervention.
Now, when there is another escalating crisis in Syria -- Bashar al-Assad's unjustifiable mass murder of protesters -- Russia and China have stepped in to veto further UNSC action. This was an entirely predictable response, as both Russia and China were openly scornful of the misleading statements made by interventionists in NATO and the Arab League to win support for Libya.
The veto has led some analysts to say the UNSC is losing relevance, but it seems to me that the opposite might actually be true: the politics of the UNSC should matter as much for launching an intervention as the merits of actually attacking the target country. There is no doubt that what is unfolding in Syria is an atrocity that must end. Sadly, the Libya intervention itself, while a precedent for the idea of global action against a humanitarian threat, is also a very real reason that the world will have a tougher time doing anything for Syria.
Walter Russell Mead wrote an excellent exegesis of the entirety of Russia's calculations on the veto, taking special note of Russian domestic politics and the country's obsession with its own diminishment in international bodies like the UN. Put simply: Russia expected some consideration in the Libyan campaign, but instead the relevant players are actively working against Russian interests there, even post-Qaddafi. Moscow could not risk the same thing happening to its many interests in Syria.
Even if it were not an election year in Russia, where Putin has just been reminded that he does not enjoy uncritical love from his people, it's likely Russia would have vetoed Syria because of Libya. But there are additional, bigger politics to consider as well.
Many states, none of them free, worry that the West's renewed love of intervention might one day be focused upon them. This is a critical consequence of rejecting sovereignty and declaring governments unfit to rule through a mixture of expediency and opportunity. Powerful states with poor human rights records -- Russia and China included -- look at what happened in Libya and see disaster, not freedom. And they are taking steps to avoid it.
In a broader sense, too, the renewed focus on intervention, especially considering what happened in Libya, could have pernicious consequences. Qaddafi famously gave up his nuclear weapons program in 2003. That he was later overthrown right after the U.S. re-established diplomatic ties with Tripoli isn't broadly seen as a victory for diplomacy and denuclearization, but rather as a textbook case of why nuclear weapons are fantastic invasion insurance. That may be one reason (among many others) why Iran seems so unwilling to contemplate abandoning its own nuclear weapons program -- it believes that nuclear weapons will prevent a capricious and unpredictable West from invading or intervening in its internal affairs.
In a vacuum, intervening to prevent mass killings in Libya made sense. Libya, however, did not (and does not) exist in a vacuum. It has both internal and regional politics. So does Syria. The failure to gain international buy-in to do something -- not necessarily militarily but some response -- to the atrocities there is a direct consequence of interventionists ignoring politics in their rush to do good. Unfortunately, the people of Syria are now paying the price, and will continue to do so.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. President has been a better comedian than Barack Obama. It’s really that simple.
That’s not to say other modern-day presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton all excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.
The long-running cartoon’s representation of Judaism was one of the first on television.
Growing up in south London, and then in the largely Catholic town of Manhasset on Long Island, I didn’t encounter many families who looked, sounded, or behaved like mine. In England, my experiences were limited either to my mother’s family, who were all Orthodox Jews, strictly observing the Sabbath and keeping kosher, or to the families of my classmates, who were invariably gentiles. In Manhasset, I didn’t even have the Orthodox to relate to. So one of my main comforts in both places came from the Pickles family, who—with its big-haired, neurotic, doting mother and its old-world, Yiddish-mumbling grandparents—instantly made me feel at home. It also helped that I could spend time with the Pickles family whenever I wanted; after all, they were on TV.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
...isn't something that can be done on campus. It's an internship.
When I was 17, if you asked me how I planned on getting a job in the future, I think I would have said: Get into the right college. When I was 18, if you asked me the same question, I would have said: Get into the right classes. When I was 19: Get good grades.
But when employers recently named the most important elements in hiring a recent graduate, college reputation, GPA, and courses finished at the bottom of the list. At the top, according to the Chronicle of Higher Education, were experiences outside of academics: Internships, jobs, volunteering, and extracurriculars.
What Employers Want
"When employers do hire from college, the evidence suggests that academic skills are not their primary concern," says Peter Cappelli, a Wharton professor and the author of a new paper on job skills. "Work experience is the crucial attribute that employers want even for students who have yet to work full-time."
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Two scholars discuss the ups and downs of life as a right-leaning professor.
“I don’t think I can say it too strongly, but literally it just changed my life,” said a scholar, about reading the work of Ayn Rand. “It was like this awakening for me.”
Different versions of this comment appear throughout Jon A. Shields and Joshua M. Dunn Sr.’s book on conservative professors, Passing on the Right, usually about people like Milton Friedman, John Stuart Mill, and Friedrich Hayek. The scholars they interviewed speak in a dreamy way about these nerdy celebrities, perhaps imagining an alternate academic universe—one where social scientists can be freely conservative.
The assumption that most college campuses lean left is so widespread in American culture that it has almost become a caricature: intellectuals in thick-rimmed glasses preaching Marxism on idyllic grassy quads; students protesting minor infractions against political correctness; raging professors trying to prove that God is, in fact, dead. Studies about professors’ political beliefs and voting behavior suggest this assumption is at least somewhat correct. But Shields and Dunn set out to investigate a more nuanced question: For the minority of professors who are cultural and political conservatives, what’s life actually like?
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, about one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. At lifelong scales, about one in 120 Americans dies in a car accident.
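The compounding arithmetic above can be sketched in a few lines of Python. The 1-in-9,395 annual figure comes from the text; the ~79-year lifespan is an assumption added here for illustration:

```python
# Lifetime risk from an annual risk, by compounding the chance of
# surviving each year and then taking the complement.
annual_risk = 1 / 9395   # yearly odds of dying in a car crash (from the text)
lifespan = 79            # assumed average American lifespan, in years

survive_all_years = (1 - annual_risk) ** lifespan
lifetime_risk = 1 - survive_all_years

print(f"Annual risk:   {annual_risk:.4%}")             # about 0.01 percent per year
print(f"Lifetime risk: 1 in {1 / lifetime_risk:.0f}")  # roughly 1 in 120
```

Because the annual probability is tiny, the compounded lifetime figure is close to simply multiplying the annual risk by the number of years, which is why the back-of-the-envelope version of this calculation gives nearly the same answer.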
After the successful Allied invasions of western France, Germany gathered reserve forces and launched a massive counter-offensive in the Ardennes, which collapsed by January. At the same time, Soviet forces were closing in from the east, invading Poland and East Prussia. By March, Western Allied forces were crossing the Rhine River, capturing hundreds of thousands of troops from Germany's Army Group B. The Red Army had meanwhile entered Austria, and both fronts quickly approached Berlin. Strategic bombing campaigns by Allied aircraft were pounding German territory, sometimes destroying entire cities in a night. In the first several months of 1945, Germany put up a fierce defense but rapidly lost territory, ran out of supplies, and exhausted its options. In April, Allied forces pushed through the German defensive line in Italy. East met West on the River Elbe on April 25, 1945, when Soviet and American troops linked up near Torgau, Germany. Then came the end of the Third Reich, as the Soviets took Berlin, Adolf Hitler committed suicide on April 30, and Germany surrendered unconditionally on all fronts on May 8 (May 7 on the Western Front). Hitler's planned "Thousand-Year Reich" lasted only 12 incredibly destructive years. (This entry is Part 17 of a weekly series.)
Why thyroid diseases are so common—and still so mysterious
When I first suspected I was suffering from hypothyroidism, I did what any anxious, Internet-connected person would do and Googled "dysfunctional thyroid symptoms," and, in another tab, "hypothyroid thinning hair??" for good measure.
What came up sounded like someone describing me for an intimately detailed police sketch:
heightened sensitivity to cold
unexplained weight gain
a pale, puffy face ("Finally, a medical explanation for this," I thought.)
This, combined with the fact that a close family member had recently been diagnosed with a thyroid disorder, sent me scurrying to the nearest endocrinologist's office. They took a blood test, and two weeks later the results came back. Sure enough, the doctor said solemnly, I had hypothyroidism, which meant my thyroid was underactive. She would be starting me on thyroid medication. She couldn't know for sure, but I might have to take drugs for the rest of my life.