William Langewiesche, “The Million-Dollar Nose”; Carl Elliott, “A New Way to Be Mad”; Barbara Ferry and Debbie Nathan, “Mistaken Identity? The Case of New Mexico's 'Hidden Jews'”; Stephen Budiansky, “The Physics of Gridlock”; and much more.
The phenomenon is not as rare as one might think: healthy people deliberately setting out to rid themselves of one or more of their limbs, with or without a surgeon's help. Why do pathologies sometimes arise as if from nowhere? Can the mere description of a condition make it contagious?
With his stubborn disregard for the hierarchy of wines, Robert Parker, the straight-talking American wine critic, is revolutionizing the industry -- and teaching the French wine establishment some lessons it would rather not learn.
Imagine descendants of Jews pursued by the Spanish Inquisition, still tending the dying embers of their faith among peasant Latinos in the American Southwest. The story has obvious resonance, and it has garnered considerable publicity. The truth of the matter may turn out to be vastly different, and nearly as improbable.
In 12 of 16 past cases in which a rising power has confronted a ruling power, the result has been bloodshed.
When Barack Obama meets this week with Xi Jinping during the Chinese president’s first state visit to America, one item probably won’t be on their agenda: the possibility that the United States and China could find themselves at war in the next decade. In policy circles, this appears as unlikely as it would be unwise.
And yet 100 years on, World War I offers a sobering reminder of man’s capacity for folly. When we say that war is “inconceivable,” is this a statement about what is possible in the world—or only about what our limited minds can conceive? In 1914, few could imagine slaughter on a scale that demanded a new category: world war. When war ended four years later, Europe lay in ruins: the kaiser gone, the Austro-Hungarian Empire dissolved, the Russian tsar overthrown by the Bolsheviks, France bled for a generation, and England shorn of its youth and treasure. A millennium in which Europe had been the political center of the world came to a crashing halt.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
A child psychologist argues punishment is a waste of time when trying to eliminate problem behavior. Try this instead.
Say you have a problem child. If it’s a toddler, maybe he smacks his siblings. Or she refuses to put on her shoes as the clock ticks down to your morning meeting at work. If it’s a teenager, maybe he peppers you with obscenities during your all-too-frequent arguments. The answer is to punish them, right?
Not so, says Alan Kazdin, director of the Yale Parenting Center. Punishment might make you feel better, but it won’t change the kid’s behavior. Instead, he advocates for a radical technique in which parents positively reinforce the behavior they do want to see until the negative behavior eventually goes away.
As I was reporting my recent series about child abuse, I came to realize that parents fall roughly into three categories. There’s a small number who seem intuitively to do everything perfectly: Moms and dads with chore charts that actually work and snack-sized bags of organic baby carrots at the ready. There’s an even smaller number who are horrifically abusive to their kids. But the biggest chunk by far are parents in the middle. They’re far from abusive, but they aren’t super-parents, either. They’re busy and stressed, so they’re too lenient one day and too harsh the next. They have outdated or no knowledge of child psychology, and they’re scrambling to figure it all out.
Life in Ohio's proud but economically abandoned small towns
Just over a decade ago, Matt Eich started photographing rural Ohio. He found himself drawn to the proud but economically abandoned small towns of Appalachia, largely inhabited by what is now known as the “Forgotten Class” of white, blue-collar workers. Thanks to grants from the Economic Hardship Reporting Project and Getty Images, Eich was able to capture the family life, drug abuse, poverty, and listlessness of these communities. “Long before Trump was a player on the political scene, long before he was a Republican, these people existed and these problems existed,” Eich said. His new book, Carry Me Ohio, published by Sturm and Drang, is a collection of these images and the first of four books he plans to publish as part of The Invisible Yoke, a photographic meditation on the American condition. Even with a deep knowledge of the region, Eich was unprepared for the fury and energy that surrounded the election this year. “The anger is overpowering,” he said. “I knew what was going on, and I’m still surprised. I should have listened to the pictures.”
Universities themselves may be contributing to burnout.
With half of all doctoral students leaving graduate school without finishing, something significant and overwhelming must be happening for at least some of them during the process of obtaining that degree. Mental illness is often offered as the standard rationale to explain why some graduate students burn out. Some research has suggested a link between intelligence and conditions such as bipolar disorder, leading some observers to believe many graduate students struggle with mental-health problems that predispose them to burning out.
But such research is debatable, and surely not every student who drops out has a history of mental illness. So, what compels students to abandon their path to a Ph.D.? Could there be other underlying factors, perhaps environmental, that can cause an otherwise-mentally-healthy graduate student to become anxious, depressed, suicidal, or, in rare cases, violent?
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land and its crew was held hostage for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach of protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
Comedy-drama series like Fleabag and Transparent show how vulnerability is as important as unlikeability and strength when it comes to portraying fictional women.
In the first episode of the HBO series Enlightened, the show’s heroine, Amy Jellicoe, learns that she’s been fired. She does not take the news well. Within minutes, she goes from pitiable victim, sobbing abjectly in a bathroom stall, to mascara-streaked fury. “Go back to your sad, fucking, little desk,” she sneers at her assistant before tracking her ex-lover and presumed betrayer to the office lobby. “I will destroy you—I will bury you—I will kill you, motherfucker!” she screams at him through the elevator doors that she somehow, in a feat of desperation, manages to pry open.
Though the scene aired five years ago, it’s still a pretty radical few minutes of television, and not just because of the ferocity of Laura Dern’s performance. What feels most striking is the series’ willingness to dramatize an extended scene of female distress for something other than a moralizing end. In this sense, Enlightened anticipates the Amazon series Fleabag, which evinces a similar empathy toward a female character in the grip of powerfully negative emotions: anger, sadness, grief, self-doubt, shame. It’s probably no accident that the two shows have almost identical promotional stills—close-ups of their protagonists’ makeup-smudged faces, staring directly into the camera. Like a number of other female-centric, female-created tragicomedies to have emerged on TV in recent years—Transparent, Girls, Catastrophe, Insecure—the series also share a commitment to more compassionate portrayals of dysfunctional heroines, suspending judgment even (or especially) when they’re at their worst.
Switching pastas and breads is a small decision that could save lives.
Multigrain is a genius approach to selling both white bread and righteousness. The term crept under the umbrella of health quietly. It wasn’t clear why, exactly. (The grain part? Or the multi?) At least it wasn’t white bread, right?
As many eaters of bread came to understand that white bread is the nutritional equivalent of Pixy Stix—the nutritious, fibrous shell of the wheat having been removed, leaving us with only the inner starch, which our bodies almost instantly turn into sugar—it needed some rebranding.
Multigrain is now often used to imply wholesomeness, a virtue to which it often has no claim. Containing the flour of multiple grains does not mean containing whole grains. Flour milled from the intact grain, bran and germ included, is whole grain flour; it contains fiber, appeasing the pancreas and the microbes that demand it for optimal performance. So the term to look for is 100 percent whole wheat. (Or whole grain, though the grain is usually wheat.)
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), so I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.