The vast majority of people who avoid gluten don’t have celiac disease or even a gluten sensitivity, but as reader Rachel can attest, there’s a big upside to the proliferation of all the GF products and menus fueled by the fad (even as Hamblin noted the downsides):
I found out 10 years ago this month that I had Celiac. I was having horrible stomach pain, reflux, ulcers, etc., and at 19 I had zero quality of life. My biopsy came back positive for Celiac but my blood work was negative, so my doctors weren’t sure at the time how to diagnose me.
Going gluten-free 10 years ago was one of the most overwhelming and terrifying things I had ever experienced. My doctor flat-out told me I could continue to eat gluten but I would most likely develop colon cancer by the time I was 40.
I was living in Nashville, where everything was fried, I had no family around me, and nothing was labeled on food items. I remember crying in the grocery store because I had no idea what to buy. I thought, “Am I ever going to be able to eat a sandwich again??” I ate corn tortillas, hummus, eggs, and cheese for an entire month until I found some resources on Celiac.
As there has been a lot more awareness of Celiac over the years, and even a cool factor to being gluten free, I have found it much easier to live this way without getting sick. I have traveled around the world and all over the U.S., and it’s been almost a non-issue in many places. I’m grateful for the awareness.
(Also, as a helpful hint, if I get gluten in my meal, I’ve found that sipping Apple Cider Vinegar in water helps alleviate the symptoms. I’m not a doctor, but it helps tremendously.)
On the flip side, I tend to get many disparaging looks when I ask for a gluten free menu, if something has gluten in it, or when I tell people I’m not able to eat it. In fact, I'm more likely to not tell someone and either go hungry or try to figure out an alternative option because of the negative responses.
I know that Celiac is genetic, and though I don’t have children right now, I worry about if they will inherit the gene and whether or not I should start them on a gluten-free diet as babies. I guess I’ll just have to take it one day at a time, but all I know is that I’ll still be gluten free even when it’s not a cool thing to do.
You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but I resisted it until I was desperate for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with). But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods like wheat which contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I've never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and The Atlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the prevalence of celiac among Americans as high as 1 in 35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly among educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and gave it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapidly and how widely “gluten free” has spread. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for a long enough time without keeping them in a lab environment 24/7 is incredibly difficult if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see if going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, they may find the culprit is other than gluten, or that gluten without other natural enzymes in food that’s more alive or containing more of the plants original components might be healthier.
As an N of 1, if I eat gluten now, I get depressed the next day. I don’t seem to have any other negative symptoms as others with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of Hashimoto’s (lethargy, weight gain, skin issues) when I remain gluten free. I went off it for a while, started eating gluten again, and gained 20 lbs. But this might also be attributable to the fact that more foods were available (e.g., a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months of going strictly gluten free, every trace of reflux and GI distress disappeared, and over 12 years they have never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and go off them, you get rebound hyper-acidity at double your pre-med levels, and that lasts a couple of months. Great for pharma marketers. I used an Internet protocol from Jacob Teitelbaum, MD, to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living? So many people such as myself have found gluten-free living to resolve a host of problems even though we’re not celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon’s razor: “Never attribute to malice that which is adequately explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to not eat processed GF foods as much as possible and focus on fruits, veggies, nuts, good fats and protein. It is not easy because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
Getting COVID-19 when you’re vaccinated isn’t the same as getting COVID-19 when you’re unvaccinated.
A new dichotomy has begun dogging the pandemic discourse. With the rise of the über-transmissible Delta variant, experts are saying you’re either going to get vaccinated, or going to get the coronavirus.
For some people—a decent number of us, actually—it’s going to be both.
Post-vaccination infections, or breakthroughs, might occasionally turn symptomatic, but they aren’t shameful or aberrant. They also aren’t proof that the shots are failing. These cases are, on average, gentler and less symptomatic; faster-resolving, with less virus lingering—and, it appears, less likely to pass the pathogen on. The immunity offered by vaccines works in iterations and gradations, not absolutes. It does not make a person completely impervious to infection. It also does not evaporate when a few microbes breach a body’s barriers. A breakthrough, despite what it might seem, does not cause our defenses to crumble or even break; it does not erase the protection that’s already been built. Rather than setting up fragile and penetrable shields, vaccines reinforce the defenses we already have, so that we can encounter the virus safely and potentially build further upon that protection.
Many queer people are reimagining their own boundaries and thinking of this reentry period as a time for sexual self-discovery.
The pandemic has affected our sex lives in many unusual ways, but perhaps none more unusual than this development: The coronavirus has highlighted the possible public-health benefits of glory holes. Sexual positions that make use of walls as physical barriers have long been considered niche. But when the New York City Department of Health recommended them last month as part of a push for safer sex, it tapped into a question that many of us have been asking: How do you seek sexual satisfaction during a global health crisis?
I haven’t had sex in more than a year, mostly because I took COVID-19 very seriously. I disconnected from the public sphere. No one visited my apartment. I disinfected my groceries and covered my apartment’s air vents with trash bags. As a queer person, I could barely register the idea of sex while living alongside a deadly virus that nobody really understood. One study published early in the pandemic showed that 43.5 percent of people reported a decrease in the quality of their sex life. Study participants had fewer sexual encounters with other people, and even masturbated less often.
In the United States, this pandemic could be almost over by now. The reasons it’s still going are pretty clear.
In the United States, this pandemic could’ve been over by now, and certainly would’ve been by Labor Day. If the pace of vaccination through the summer had been anything like the pace in April and May, the country would be nearing herd immunity. With most adults immunized, new and more infectious coronavirus variants would have nowhere to spread. Life could return nearly to normal.
Experts list many reasons for the vaccine slump, but one big reason stands out: vaccine resistance among conservative, evangelical, and rural Americans. Pro-Trump America has decided that vaccine refusal is a statement of identity and a test of loyalty.
In April, people in counties that Joe Biden won in 2020 were two points more likely to be fully vaccinated than people in counties that Donald Trump won: 22.8 percent were fully vaccinated in Biden counties; 20.6 percent were fully vaccinated in Trump counties. By early July, the vaccination gap had widened to almost 12 points: 46.7 percent were fully vaccinated in Biden counties, 35 percent in Trump counties. When pollsters ask about vaccine intentions, they record a 30-point gap: 88 percent of Democrats, but only 54 percent of Republicans, want to be vaccinated as soon as possible. All told, Trump support predicts a state’s vaccine refusal better than average income or education level.
Gather friends and feed them, laugh in the face of calamity, and cut out all the things––people, jobs, body parts––that no longer serve you.
“The only thing a uterus is good for after a certain point is causing pain and killing you. Why are we even talking about this?” Nora jams a fork into her chopped chicken salad, the one she insisted I order as well. “If your doctor says it needs to come out, yank it out.” Nora speaks her mind the way others breathe: an involuntary reflex, not a choice. (Obviously, all dialogue here, including my own, is recorded from the distortion field of memory.)
“But the uterus …” I say, spearing a slice of egg. “It’s so …”
“Yes. Don’t roll your eyes.”
“I’m not rolling my eyes.” She leans in. “I’m trying to get you to face a, well, it’s not even a hard truth. It’s an easy one. Promise me the minute you leave this lunch you’ll pick up the phone and schedule the hysterectomy today. Not tomorrow. Today.”
Simone Biles is the greatest athlete in the world today.
For me, this isn’t a debate. It’s a statement of fact. On Sunday, she won a record seventh United States gymnastics championship, continuing her jaw-dropping winning streak in every all-around competition she’s entered since 2013. The 24-year-old hasn’t lost in eight years. Typical gymnasts her age aren’t beating all their rivals by the big margins that, for Biles, have become routine.
Although Tom Brady won his seventh Super Bowl at age 43, he is no longer in his prime, and other Super Bowl–winning quarterbacks, including Patrick Mahomes and Aaron Rodgers, are arguably more physically talented. Unlike the current greats in other sports, Biles has no peer. Serena Williams is the greatest female tennis player of all time and among the greatest athletes of all time, but her career is winding down, and Naomi Osaka is in position to unseat her as the face of women’s tennis. LeBron James won’t get a chance to defend the NBA title he won with the Los Angeles Lakers last season, because the Phoenix Suns eliminated his team in the first round of this year’s playoffs.
In 1955, just past daybreak, a Chevrolet truck pulled up to an unmarked building. A 14-year-old child was in the back.
This article was published online on July 22, 2021.
The dentist was a few minutes late, so I waited by the barn, listening to a northern mockingbird in the cypress trees. His tires kicked up dust when he turned off Drew Ruleville Road and headed across the bayou toward his house. He got out of his truck still wearing his scrubs and, with a smile, extended his hand: “Jeff Andrews.”
The gravel crunched under his feet as he walked to the barn, which is long and narrow with sliding doors in the middle. Its walls are made of cypress boards, weathered gray, and it overlooks a swimming pool behind a white columned house. Jeff Andrews rolled up the garage door he’d installed.
Our eyes adjusted to the darkness of the barn where Emmett Till was tortured by a group of grown men. Christmas decorations leaned against one wall. Within reach sat a lawn mower and a Johnson 9.9-horsepower outboard motor. Dirt covered the spot where Till was beaten, and where investigators believe he was killed. Andrews thinks he was strung from the ceiling, to make the beating easier. The truth is, nobody knows exactly what happened in the barn, and any evidence is long gone. Andrews pointed to the central rafter.
They’re not all anti-vaxxers, and treating them as such is making things worse.
Last week, CDC Director Rochelle Walensky said that COVID-19 is “becoming a pandemic of the unvaccinated.” President Joe Biden said much the same shortly after. They are technically correct. Even against the fast-spreading Delta variant, the vaccines remain highly effective, and people who haven’t received them are falling sick far more often than those who have. But their vulnerability to COVID-19 is the only thing that unvaccinated people universally share. They are disparate in almost every way that matters, including why they haven’t yet been vaccinated and what it might take to persuade them. “‘The unvaccinated’ are not a monolith of defectors,” Rhea Boyd, a pediatrician and public-health advocate in the San Francisco Bay Area, tweeted on Saturday.
Persistent hype around mRNA vaccine technology is now distracting us from other ways to end the pandemic.
At the end of January, reports that yet another COVID-19 vaccine had succeeded in its clinical trials—this one offering about 70 percent protection—were front-page news in the United States, and occasioned push alerts on millions of phones. But when the Maryland-based biotech firm Novavax announced its latest stunning trial results last week, and an efficacy rate of more than 90 percent even against coronavirus variants, the response from the same media outlets was muted in comparison. The difference, of course, was the timing: With three vaccines already authorized for emergency use by the U.S. Food and Drug Administration, the nation is “awash in other shots” already, as The New York Times put it.
The once-dynamic state is closing the door on economic opportunity.
Behold California, colossus of the West Coast: the most populous American state; the world’s fifth-largest economy; and arguably the most culturally influential, exporting Google searches and Instagram feeds and iPhones and Teslas and Netflix Originals and kimchi quesadillas. This place inspires awe. If I close my eyes I can see silhouettes of Joshua trees against a desert sunrise; seals playing in La Jolla’s craggy coves of sun-spangled, emerald seawater; fog rolling over the rugged Sonoma County coast at sunset into primeval groves of redwoods that John Steinbeck called “ambassadors from another time.”
This landscape is bejeweled with engineering feats: the California Aqueduct; the Golden Gate Bridge; and the ribbon of Pacific Coast Highway that stretches south of Monterey, clings to the cliffs of Big Sur, and descends the kelp-strewn Central Coast, where William Randolph Hearst built his Xanadu on a hillside where his zebras still graze. No dreamscape better inspires dreamers. Millions still immigrate to my beloved home to improve both their prospects and ours.
A newish wave of sophisticated, adult board games has made exploitation part of its game mechanics. A reckoning is coming.
The board game “Puerto Rico” begins after everyone around the table receives a mat printed with the verdant interior of the game’s namesake island. Players are cast as European tycoons who have trekked across the Atlantic at the height of the Age of Exploration. “In 1493 Christopher Columbus discovered the easternmost island of the Great Antilles,” read the back of the game box that once sat on my living-room shelf. “About 50 years later, Puerto Rico began to really blossom.” To win, one must “achieve the greatest prosperity and highest respect.”
In practice, that means the mechanics of “Puerto Rico” are centered on cultivation, exploitation, and plunder. Each turn, a player takes a role—the “settler,” the “builder,” the “trader,” the “craftsman,” the “captain,” and so on—and tries to slowly transform their tropical enclave into a tidy, 16th-century imperial settlement. Perhaps they uproot the wilds and replace them with tobacco pastures or corn acreage, or maybe they outfit the rocky reefs with fishing wharves and harbors, in order to ship those goods back across the ocean. All of this is possible only with the help of a resource that the game calls “colonists”—represented by small, brown discs in the game’s first edition, which was published by Rio Grande Games and is available in major retailers—who arrive by ship and are sent by players to work on their plantations.