You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but I resisted it until I was desperate for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with). But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods like wheat that contain sugars which ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I’ve never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and TheAtlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the prevalence of celiac disease among Americans as high as 1 in 35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly among educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and gave it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapid and widespread the “gluten free” movement has become. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for a long enough time without keeping them in a lab environment 24/7 is incredibly difficult if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see if going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, they may find the culprit is something other than gluten, or that gluten is better tolerated in food that’s more alive, retaining its natural enzymes and more of the plant’s original components.
As an N of 1, if I eat gluten now, I get depressed the next day. I don’t seem to have any other negative symptoms as others with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of Hashimoto’s (lethargy, weight gain, skin issues) when I remain gluten free. I went off it for a while, started eating gluten again, and gained 20 lbs. But this might also be attributable to the fact that more foods were available (e.g., a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months after going strictly gluten free, every trace of reflux and GI distress disappeared, and over 12 years it has never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and go off them, you get rebound hyper-acidity at double your pre-med levels, and that lasts a couple of months. Great for pharma marketers. I used an Internet protocol from Jacob Teitelbaum MD to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living? So many people such as myself have found gluten-free living to resolve a host of problems even though we are not celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon’s razor: “Never attribute to malice that which is adequately explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to not eat processed GF foods as much as possible and focus on fruits, veggies, nuts, good fats and protein. It is not easy because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
Colonizing the red planet is a ridiculous way to help humanity.
There’s no place like home—unless you’re Elon Musk. A prototype of SpaceX’s Starship, which may someday send humans to Mars, is, according to Musk, likely to launch soon, possibly within the coming days. But what motivates him? Why bother with Mars? A video clip from a 2019 interview seems to sum up Musk’s vision—and everything that’s wrong with it.
In the video, Musk is seen reading a passage from Carl Sagan’s book Pale Blue Dot. The book, published in 1994, was Sagan’s response to the famous image of Earth as a tiny speck of light floating in a sunbeam—a shot he’d begged NASA to have the Voyager 1 spacecraft take in 1990 as it sailed into space, 3.7 billion miles from Earth. Sagan believed that if we had a photo of ourselves from this distance, it would forever alter our perspective of our place in the cosmos.
When the polio vaccine was declared safe and effective, the news was met with jubilant celebration. Church bells rang across the nation, and factories blew their whistles. “Polio routed!” newspaper headlines exclaimed. “An historic victory,” “monumental,” “sensational,” newscasters declared. People erupted with joy across the United States. Some danced in the streets; others wept. Kids were sent home from school to celebrate.
One might have expected the initial approval of the coronavirus vaccines to spark similar jubilation—especially after a brutal pandemic year. But that didn’t happen. Instead, the steady drumbeat of good news about the vaccines has been met with a chorus of relentless pessimism.
It’s not just one problem—and we’re going to need a portfolio of approaches to solve it.
Why wouldn’t someone want a COVID-19 vaccine?
Staring at the raw numbers, it doesn’t seem like a hard choice. Thousands of people are dying of COVID-19 every day. Meanwhile, out of the 75,000 people who received a shot in the vaccine trials from Pfizer-BioNTech, Moderna, AstraZeneca, Johnson & Johnson, and Novavax, zero died and none were hospitalized after four weeks. As the United States screams past 500,000 fatalities, the choice between a deadly disease and a shot in the arm might seem like the easiest decision in the world.
Or not. One-third of American adults said this month that they don’t want the vaccine or are undecided about whether they’ll get one. That figure has declined in some polls. But it remains disconcertingly high among Republicans, young people, and certain minority populations. In pockets of vaccine hesitancy, the coronavirus could continue to spread, kill, mutate, and escape. That puts all of us at risk.
The GOP has become, in form if not in content, the Communist Party of the Soviet Union of the late 1970s.
We are living in a time of bad metaphors. Everything is fascism, or socialism; Hitler’s Germany, or Stalin’s Soviet Union. Republicans, especially, want their followers to believe that America is on the verge of a dramatic time, a moment of great conflict such as 1968—or perhaps, even worse, 1860. (The drama is the point, of course. No one ever says, “We’re living through 1955.”)
Ironically, the GOP is indeed replicating another political party in another time, but not as the heroes they imagine themselves to be. The Republican Party has become, in form if not in content, the Communist Party of the Soviet Union of the late 1970s.
I can already hear the howls about invidious comparisons. I do not mean that modern American Republicans are communists. Rather, I mean that the Republicans have entered their own kind of end-stage Bolshevism, as members of a party that is now exhausted by its failures, cynical about its own ideology, authoritarian by reflex, controlled as a personality cult by a failing old man, and looking for new adventures to rejuvenate its fortunes.
Adam Kinzinger says he’ll fight to take his party back from Donald Trump.
Adam Kinzinger is a liberated individual—liberated from his party leadership, liberated from the fear of being beaten in a primary, liberated to speak his mind. The 43-year-old representative was one of 10 House Republicans who voted to impeach Donald Trump for inciting the attack on the U.S. Capitol.
“I don’t have a constitutional duty to defend against a guy that is a jerk and maybe says some things I don’t like,” Kinzinger told me, explaining what had pushed him to finally break with the president. “I do when he’s getting ready to destroy democracy—and we saw that culminate on January 6th.”
This was the sort of language a number of Republicans used in the immediate aftermath of the riot. “The president bears responsibility for Wednesday’s attack on Congress by mob rioters,” House Minority Leader Kevin McCarthy said on January 13. But by the end of the month, McCarthy was traveling hat in hand to Mar-a-Lago to meet with Trump.
A global pandemic doesn’t give us cause to treat the aged callously.
Crises can elicit compassion, but they can also evoke callousness. Since the outbreak of the coronavirus pandemic, we’ve witnessed communities coming together (even as they have sometimes been physically forced apart), and we’ve seen individuals engaging in simple acts of kindness to remind the sick and quarantined that they are not forgotten. Yet from some quarters, we’ve also seen a degree of cruelty that is truly staggering.
Earlier today, a friend posted on Facebook about an experience he’d just had on the Upper West Side of Manhattan: “I heard a guy who looked to be in his 20s say that it’s not a big deal cause the elderly are gonna die anyway. Then he and his friend laughed … Maybe I’m lucky that I had awesome grandparents and maybe this guy didn’t but what is wrong with people???” Some have tried to dress up their heartlessness as generational retribution. As someone tweeted at me earlier today, “To be perfectly honest, and this is awful, but to the young, watching as the elderly over and over and over choose their own interests ahead of Climate policy kind of feels like they’re wishing us to a death they won’t have to experience. It’s a sad bit of fair play.”
Side effects are just a sign that protection is kicking in as it should.
At about 2 a.m. on Thursday morning, I woke to find my husband shivering beside me. For hours, he had been tossing in bed, exhausted but unable to sleep, nursing chills, a fever, and an agonizingly sore left arm. His teeth chattered. His forehead was freckled with sweat. And as I lay next to him, cinching blanket after blanket around his arms, I felt an immense sense of relief. All this misery was a sign that the immune cells in his body had been riled up by the second shot of a COVID-19 vaccine, and were well on their way to guarding him from future disease.
Side effects are a natural part of the vaccination process, as my colleague Sarah Zhang has written. Not everyone will experience them. But the two COVID-19 vaccines cleared for emergency use in the United States, made by Pfizer/BioNTech and Moderna, already have reputations for raising the hackles of the immune system: In both companies’ clinical trials, at least a third of the volunteers ended up with symptoms such as headaches and fatigue; fevers like my husband’s were less common.
Chloé Zhao’s Oscar contender about one woman’s itinerant life speaks volumes about this country’s myths of self-sufficiency.
Fern (played by Frances McDormand), the hardscrabble hero of Chloé Zhao’s Nomadland, is the kind of resolute, independent protagonist who has dominated American movies since the dawn of the Western genre. She drives around the country in her van, living as self-sufficiently as possible, and carries a flinty affect with people, revealing little about herself and the turmoil that has led to her life on the road. But Fern is not a bullheaded cowboy fighting on the frontier. She’s a newly widowed woman in her early 60s searching for a meaningful existence in a nation that’s become hostile to ordinary citizens in need of help.
Zhao’s epic sweep of a movie, which travels the American West from Nevada to South Dakota, is crammed with beautiful photography of some of the country’s most dramatic landscapes. It’s also overflowing with Zhao’s empathetic style of storytelling, and the ensemble largely features nonactors playing themselves, relaying stories of survival on the road in the aftermath of 2008’s Great Recession. As the United States weathers another seismic economic and humanitarian crisis, Zhao’s film offers insightful perspective on how terrifying and tenuous the American dream can be.
We’ll never know for sure how contagious people are after they’re vaccinated, but we do know how they should act.
Every day, more than 1 million American deltoids are being loaded with a vaccine. The ensuing immune response has proved to be extremely effective—essentially perfect—at preventing severe cases of COVID-19. And now, with yet another highly effective vaccine on the verge of approval, that pace should further accelerate in the weeks to come.
This is creating a legion of people who no longer need to fear getting sick, and are desperate to return to “normal” life. Yet the messaging on whether they might still carry and spread the disease—and thus whether it’s really safe for them to resume their unmasked, un-distanced lives—has been oblique. Anthony Fauci said last week on CNN that “it is conceivable, maybe likely,” that vaccinated people can get infected with the coronavirus and then spread it to someone else, and that more will be known about this likelihood “in some time, as we do some follow-up studies.” CDC Director Rochelle Walensky had been no more definitive on Meet the Press a few days before, where she told the host, “We don’t have a lot of data yet to inform exactly the question that you’re asking.”
An uncertain spring, an amazing summer, a cautious fall and winter, and then, finally, relief.
Updated at 10:12 a.m. ET on February 24, 2021.
The end of the coronavirus pandemic is on the horizon at last, but the timeline for actually getting there feels like it shifts daily, with updates about viral variants, vaccine logistics, and other important variables seeming to push back the finish line or scoot it forward. When will we be able to finally live our lives again?
Pandemics are hard to predict accurately, but we have enough information to make some confident guesses. A useful way to think about what’s ahead is to go season by season. In short: Life this spring will not be substantially different from the past year; summer could, miraculously, be close to normal; and next fall and winter could bring either continued improvement or a moderate backslide, followed by a near-certain return to something like pre-pandemic life.