A year after NATO intervention, Gallup finds a Libyan approval rating for U.S. leadership far above Mideast and even European norms.
A Libyan rebel holds out the U.S. flag flying from his truck. (Reuters)
About a year and a half after the U.S. and several European militaries began bombing Libya as part of the ultimately successful campaign to aid rebels there and topple Muammar Qaddafi, who was killed last October, Gallup has polled Libyan opinions and found something very unusual: some people in the Middle East seem to actually like America.
According to the just-out poll, 54 percent of Libyans say they hold a favorable view of U.S. leadership. That's really high for the Middle East. How high? The poll suggests that Libyan views are about on par with Australians (who, at 56 percent, have a slightly more favorable view), Israelis (55 percent), and Canadians (at 53 percent, slightly less). That's good company.
U.S. leadership appears to be more popular in Libya than in many of the European nations that joined the campaign against Qaddafi. The U.S. approval rating is lower in France and Spain (42 percent in both), Sweden (35 percent), and slightly lower in Italy (50 percent). But it is higher in the U.K. and the Netherlands, at 67 and 65 percent respectively. Gallup doesn't have data for Norway, which also participated. The European average, it says, is 42 percent.
It's hard to know whether Libyans' newfound appreciation for the U.S. will last, or if that approval rating will return to its pre-revolutionary 30 percent. Many complicated factors can affect public opinion, but it's notable that one of the downward pressures on U.S. favorability common to the Middle East -- living under an oppressive dictator who either vilifies or is perceived as a puppet of the United States -- is now gone from Libya. But another, perceived U.S. sponsorship of Israel and thus its unpopular policies toward Palestinians, remains.
One promising datapoint is that, for some Libyans, the revolutionary embrace of America has lasted at least a year so far. Last August, the Los Angeles Times reported that young Libyans, inspired by the U.S. role in the intervention and its food aid, were sporting American flags and professing their love of American ideals as they saw them. "That's why I fly the flag -- to support American-style freedoms that we all want here," explained a 57-year-old Libyan man named Omar al-Keish.
The lesson here is probably a simple one: people like it when a foreign power helps them oust a despised dictator. But that's also an important lesson not to over-learn; Iraqis report a 29 percent approval rating for U.S. leadership and 56 percent disapproval, one of the world's highest.
In 12 of 16 past cases in which a rising power has confronted a ruling power, the result has been bloodshed.
When Barack Obama meets this week with Xi Jinping during the Chinese president’s first state visit to America, one item probably won’t be on their agenda: the possibility that the United States and China could find themselves at war in the next decade. In policy circles, this appears as unlikely as it would be unwise.
And yet 100 years on, World War I offers a sobering reminder of man’s capacity for folly. When we say that war is “inconceivable,” is this a statement about what is possible in the world—or only about what our limited minds can conceive? In 1914, few could imagine slaughter on a scale that demanded a new category: world war. When war ended four years later, Europe lay in ruins: the kaiser gone, the Austro-Hungarian Empire dissolved, the Russian tsar overthrown by the Bolsheviks, France bled for a generation, and England shorn of its youth and treasure. A millennium in which Europe had been the political center of the world came to a crashing halt.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
A child psychologist argues punishment is a waste of time when trying to eliminate problem behavior. Try this instead.
Say you have a problem child. If it’s a toddler, maybe he smacks his siblings. Or she refuses to put on her shoes as the clock ticks down to your morning meeting at work. If it’s a teenager, maybe he peppers you with obscenities during your all-too-frequent arguments. The answer is to punish them, right?
Not so, says Alan Kazdin, director of the Yale Parenting Center. Punishment might make you feel better, but it won’t change the kid’s behavior. Instead, he advocates for a radical technique in which parents positively reinforce the behavior they do want to see until the negative behavior eventually goes away.
As I was reporting my recent series about child abuse, I came to realize that parents fall roughly into three categories. There’s a small number who seem intuitively to do everything perfectly: Moms and dads with chore charts that actually work and snack-sized bags of organic baby carrots at the ready. There’s an even smaller number who are horrifically abusive to their kids. But the biggest chunk by far are parents in the middle. They’re far from abusive, but they aren’t super-parents, either. They’re busy and stressed, so they’re too lenient one day and too harsh the next. They have outdated or no knowledge of child psychology, and they’re scrambling to figure it all out.
Life in Ohio's proud but economically abandoned small towns
Just over a decade ago, Matt Eich started photographing rural Ohio. Largely inhabited by what is now known as the “Forgotten Class” of white, blue-collar workers, Eich found himself drawn to the proud but economically abandoned small towns of Appalachia. Thanks to grants from the Economic Hardship Reporting Project and Getty Images, Eich was able to capture the family life, drug abuse, poverty, and listlessness of these communities. “Long before Trump was a player on the political scene, long before he was a Republican, these people existed and these problems existed,” Eich said. His new book, Carry Me Ohio, published by Sturm and Drang, is a collection of these images and the first of four books he plans to publish as part of The Invisible Yoke, a photographic meditation on the American condition. Even with a deep knowledge of the region, Eich was unprepared for the fury and energy that surrounded the election this year. “The anger is overpowering,” he said. “I knew what was going on, and I’m still surprised. I should have listened to the pictures.”
Universities themselves may be contributing to burnout.
With half of all doctoral students leaving graduate school without finishing, something significant and overwhelming must be happening for at least some of them during the process of obtaining that degree. Mental illness is often offered as the standard rationale to explain why some graduate students burn out. Some research has suggested a link between intelligence and conditions such as bipolar disorder, leading some observers to believe many graduate students struggle with mental-health problems that predispose them to burning out.
But such research is debatable, and surely not every student who drops out has a history of mental illness. So, what compels students to abandon their path to a Ph.D.? Could there be other underlying factors, perhaps environmental, that can cause an otherwise-mentally-healthy graduate student to become anxious, depressed, suicidal, or, in rare cases, violent?
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land and its crew was held hostage for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach with protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
Switching pastas and breads is a small decision that could save lives.
Multigrain is a genius approach to selling both white bread and righteousness. The term crept under the umbrella of health quietly. It wasn’t clear why, exactly. (The grain part? Or the multi?) At least it wasn’t white bread, right?
As many eaters of bread came to understand that white bread is the nutritional equivalent of Pixy Stix—the nutritious, fibrous shell of the wheat having been removed, leaving us with only the inner starch, which our bodies almost instantly turn into sugar—it needed some rebranding.
Multigrain is now often used to imply wholesomeness, a virtue to which it often has no claim. Containing the flour of multiple grains does not mean containing whole grains. When millers leave the grain intact before milling, this is whole grain flour. It contains fiber, appeasing the pancreas and microbes that demand it for optimal performance. So, the term to look for is 100 percent whole wheat. (Or wholegrain, though the grain is usually wheat.)
Comedy-drama series like Fleabag and Transparent show how vulnerability is as important as unlikeability and strength when it comes to portraying fictional women.
In the first episode of the HBO series Enlightened, the show’s heroine, Amy Jellicoe, learns that she’s been fired. She does not take the news well. Within minutes, she goes from pitiable victim, sobbing abjectly in a bathroom stall, to mascara-streaked fury. “Go back to your sad, fucking, little desk,” she sneers at her assistant before tracking her ex-lover and presumed betrayer to the office lobby. “I will destroy you—I will bury you—I will kill you, motherfucker!” she screams at him through the elevator doors that she somehow, in a feat of desperation, manages to pry open.
Though the scene aired five years ago, it's still a pretty radical few minutes of television, and not just because of the ferocity of Laura Dern's performance. What feels most striking is the series' willingness to dramatize an extended scene of female distress for something other than a moralizing end. In this sense, Enlightened anticipates the Amazon series Fleabag, which evinces a similar empathy toward a female character in the grip of powerfully negative emotions: anger, sadness, grief, self-doubt, shame. It's probably no accident the two shows have almost identical promotional stills—close-ups of their protagonists' makeup-smudged faces, staring directly to camera. Like a number of other female-centric, female-created tragicomedies to have emerged on TV in recent years—Transparent, Girls, Catastrophe, Insecure—the series also share a commitment to more compassionate portrayals of dysfunctional heroines, by suspending judgment even (or especially) when they're at their worst.
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), but I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
A century ago, millions of Americans banded together in defense of white, Christian America and traditional morality—and most of their compatriots turned a blind eye to the Ku Klux Klan.
On August 8, 1925, more than 50,000 members of the Ku Klux Klan paraded through Washington, D.C. Some walked in lines as wide as 20 abreast, while others created formations of the letter K or a Christian cross. A few rode on horseback. Many held American flags. Men and women alike, the marchers carried banners emblazoned with the names of their home states or local chapters, and their procession lasted for more than three hours down a Pennsylvania Avenue lined with spectators. National leaders of the organization were resplendent in colorful satin robes and the rank and file wore white, their regalia adorned with a circular red patch containing a cross with a drop of blood at its center.
Nearly all of the marchers wore pointed hoods, but their faces were clearly visible. In part, that was because officials would sanction the parade only if participants agreed to walk unmasked. But a mask was not really necessary, as most members of the Klan saw little reason to hide their faces. After all, there were millions of them in the United States.