You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but I resisted, for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with), until I was desperate. But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods like wheat which contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I've never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and TheAtlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the share of Americans with celiac disease as high as 1 in 35. But that understates the problem, because it ignores people who are allergic to wheat but do not have celiac.
Since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly among educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and gave it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapidly “gluten free” has spread. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for a long enough time without keeping them in a lab environment 24/7, is incredibly difficult, if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see if going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, researchers may find that the culprit is something other than gluten, or that gluten eaten alongside the natural enzymes in less processed food—food that retains more of the plant’s original components—might be healthier.
As an N of 1, if I eat gluten now, I get depressed the next day. I don’t seem to have any other negative symptoms as others with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of Hashimoto’s (lethargy, weight gain, skin issues) when I remain gluten free. I went off it for a while, started eating gluten again, and gained 20 lbs. But this might also be attributable to the fact that more foods were available (i.e. a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac, don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months after going strictly gluten free, every trace of reflux and GI distress disappeared and over 12 years has never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and go off them, you get rebound hyper-acidity at double your pre-med levels, and that lasts a couple of months. Great for pharma marketers. I used an Internet protocol from Jacob Teitelbaum MD to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic push back against gluten-free living? So many such as myself have found gluten-free living to resolve a host of problems even though not celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon's razor: “Never attribute to malice that which is explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to not eat processed GF foods as much as possible and focus on fruits, veggies, nuts, good fats and protein. It is not easy because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
A new study suggests that almost half of those hospitalized with COVID-19 have mild or asymptomatic cases.
At least 12,000 Americans have already died from COVID-19 this month, as the country inches through its latest surge in cases. But another worrying statistic is often cited to depict the dangers of this moment: The number of patients hospitalized with COVID-19 in the United States right now is as high as it has been since the beginning of February. It’s even worse in certain places: Some states, including Arkansas and Oregon, recently saw their COVID hospitalizations rise to higher levels than at any prior stage of the pandemic. But how much do those latter figures really tell us?
From the start, COVID hospitalizations have served as a vital metric for tracking the risks posed by the disease. Last winter, this magazine described it as “the most reliable pandemic number,” while Vox quoted the cardiologist Eric Topol as saying that it’s “the best indicator of where we are.” On the one hand, death counts offer finality, but they’re a lagging signal and don’t account for people who suffered from significant illness but survived. Case counts, on the other hand, depend on which and how many people happen to get tested. Presumably, hospitalization numbers provide a more stable and reliable gauge of the pandemic’s true toll, in terms of severe disease. But a new, nationwide study of hospitalization records, released as a preprint today (and not yet formally peer reviewed), suggests that the meaning of this gauge can easily be misinterpreted—and that it has been shifting over time.
In his new film, the 91-year-old actor-director gets back in the saddle.
Clint Eastwood’s first Hollywood swan song was 1992’s Unforgiven, a dark, bitter Western that bade goodbye to the genre that had made him famous. He was 62 at the time, and after more than 30 years of riding horses on-screen, the actor-director seemed ready to retire from the fictional range. Since Unforgiven, Eastwood has made 23 more films, starring in 10 of them, and many of those projects could also be considered curtain calls. In movies such as Space Cowboys, Blood Work, Gran Torino, and The Mule, he played fading exemplars of a prior generation’s masculine ideal who were struggling to understand their place in a new world. But Eastwood’s latest film, Cry Macho, marks the first time since 1992 that he’s actually gotten back in the saddle.
Vanishingly few people have legitimate reasons to avoid COVID-19 vaccination. Some say their doctors told them not to get vaccinated anyway.
In the battle against vaccine hesitancy, many officials have suggested that people talk with their doctor if they have concerns about getting vaccinated. This advice makes a certain amount of sense. Primary-care physicians are typically the doctors patients trust most, and doctors deeply understand the benefits of vaccines. The American Medical Association has claimed, based on a survey it conducted, that 96 percent of doctors are fully vaccinated.
In recent weeks, though, I’ve heard from several people with an interesting excuse for waiting to get vaccinated: They say their doctors told them not to. Most of the people I spoke with requested anonymity so they could share sensitive health information. Most would also not give me their doctors’ names in order to shield the providers from negative consequences. The doctors whose names I did get did not return my calls or declined to comment for this story, leaving it unclear what they really think about vaccine exemptions. Some of the people I spoke with may simply be vaccine-hesitant and trying to blame a doctor for their own choice to delay or forgo getting a vaccine. But because doctors are a large and relatively diverse group of people, with varied training, credentials, and personal politics, it makes sense that some doctors would have incorrect views about vaccination.
To celebrities, the red carpet of the Met Gala is like an average person’s front lawn: a place for making bold statements. The event, an annual fundraiser for the Metropolitan Museum of Art’s Costume Institute, is made for flaunting ostentatious couture. The dress code is determined by a theme—this year’s was “American Independence,” in honor of a forthcoming exhibition—that can be interpreted however an attendee prefers. Tickets are $35,000 a pop. And for four hours, the invitees—normally the most relevant cultural figures of the year—get to mug for the camera before heading inside. As the red-carpet co-host, the actor Keke Palmer declared at the top of last night’s show, “You can never go wrong with a message.”
Bullying those who refuse to get their shots won’t work in the long run.
“Your refusal has cost all of us,” President Joe Biden said to unvaccinated people last week, as he announced a new COVID-vaccine mandate for all workers at private companies with more than 100 employees. The vaccinated, he said, are angry and frustrated with the nearly 80 million people who still haven’t received a vaccine, and their patience “is wearing thin.”
He’s not wrong about that. For people who understand that widespread vaccination is our best strategy for beating the pandemic, the 25 percent of Americans who still haven’t received a single shot are a barrier to freedom. Their exasperation is warranted.
But bullying the unvaccinated into getting their shots isn’t going to work in the long run.
The transcendent power of pilgrimage comes from its total lack of thrills.
“How to Build a Life” is a weekly column by Arthur Brooks, tackling questions of meaning and happiness.
Last month, a survey by the travel industry found that a majority of Americans changed their vacation plans this summer because of the continuing coronavirus pandemic. But not everyone canceled their vacations entirely; travel spending has been almost as high this summer as it was in the summer of 2019. Some would-be adventurers simply found ways to do the exotic things they’d planned to do overseas in less exotic places. One of my friends, for instance, went bungee jumping in North Carolina instead of Costa Rica.
For my vacation, I did the opposite: I went with my family to a fairly exotic place to do a distinctly unexotic thing. I went to Spain and took a very quiet 100-mile walk.
SpaceX just launched four private citizens into orbit for a three-day trip.
CAPE CANAVERAL, Fla.—Before liftoff, the moon was the brightest object in the sky, followed by the tiny, shining pinpricks of Venus, Jupiter, and Saturn. Then the rocket rose with a roar, a white-hot needle casting the dark evening in a soft gold. A crew of four sat atop it, strapped inside a small capsule. And none of them—not one—were professional astronauts.
The passengers who launched today are SpaceX’s first-ever private crew. They are Sian Proctor, a geoscience professor and artist; Hayley Arceneaux, a physician assistant and childhood cancer survivor; Chris Sembroski, a data engineer and Iraq War veteran; and Jared Isaacman, the tech businessman who paid for all their seats. Not long ago, they were strangers. Now they are travel buddies and—in the case of Proctor, Arceneaux, and Sembroski—the beneficiaries of a billionaire with the means to make them all spacefarers.
The battles over “virginity testing” and “virginity-restoration surgery” reveal the persistence of dangerous pseudoscience.
In the Middle Ages, a royal bride would be inspected before her wedding night to make sure she was a virgo intacta—a virgin with an intact hymen covering the entrance to her vagina. “The Hymen is a membrane not altogether without blood,” wrote the 17th-century court obstetrician Louise Bourgeois. “In the middle it hath a little hole, through which the menses are voided. This at the first time of copulation is broken, which causes some pain, and gushing forth of some quantity of blood; which is an evident sign of virginity.”
In reality, some girls are born without a hymen, while others tear the membrane long before they have sex, most commonly by exercising or, today, by using tampons. Yet the demand for virginity testing—typically, a gynecological exam in which a doctor looks for the presence of a hymen—has proved surprisingly durable. In 1979, the British government performed one on a 35-year-old Indian woman who had traveled to London to get married, in order “to see whether she was, in fact, a bona fide virgin.” (The Guardian later revealed that immigration officials subjected more than 80 women to such tests from 1976 to 1979.) The Egyptian authorities used the pretext of virginity inspections to assault female protesters during the Arab Spring in 2011, and until July of this year the Indonesian military regularly performed such assessments not only on female recruits, but also on the fiancées of its male soldiers.
Can Eric Schmitt—Missouri’s anti-mandate attorney general—sue his way to the U.S. Senate?
There’s a particular spot in Jefferson City, Missouri, the state capital, where you can walk a few yards and pass through three different sets of masking rules. Struggling against the heavy wooden doors of the state-supreme-court building and stepping through, you leave the zone of the city and county recommendations—mask when you can’t keep distance—and enter a space where masks are required by order of the court. From there, you can peer through a glass door into a government office, a parallel pandemic universe where no one can tell you what to put on your face—and where trying to do so is a form of government overreach and social control.
This is the fiefdom of Eric Schmitt, the Missouri attorney general and Republican U.S. Senate candidate. Schmitt has routinely snagged national headlines throughout the pandemic for his habit of suing people, most recently over masks. He is certainly not the only or best-known state official with bigger political ambitions battling public-health mandates in the name of personal freedom. Florida has Ron DeSantis, Texas has Greg Abbott—both governors wielding executive orders and fueling presidential speculation. Missouri does not have such a governor. Instead it has Schmitt, an ambitious attorney general wielding lawsuits.
The late comic’s best work exemplified his resistance to cheap, easy material, but also his utter unpretentiousness.
Norm Macdonald, the brilliant and lacerating stand-up comedian who died yesterday of cancer, once told one of the best jokes about the disease that I’ve ever heard. “In the old days, they’d go, ‘Hey, that old man died.’ Now they go, ‘Hey, he lost his battle.’ That’s no way to end your life!” he said. “I’m pretty sure if you die, the cancer also dies at exactly the same time. So that, to me, is not a loss; that’s a draw.” True to form, many news stories yesterday referred to Macdonald’s “battle” with the disease over the past nine years. But none mentioned that he fought it to a draw.
Macdonald was the purest kind of stand-up, someone who could sidle up to an issue as dark as cancer and talk about it with disarming frankness and goofy glee. He didn’t tell jokes to shock people or to deliver a polemic, but that didn’t mean he couldn’t be thought-provoking. He could create finely tuned routines that’d knock the house down, but he took just as much delight in eliciting roars of laughter from fellow comics by reading corny one-liners from an old joke book, to the bafflement of the audience at large.