You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but I resisted it until I was desperate for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with). But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods like wheat which contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I've never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and The Atlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the percent of Americans with celiac as high as 1-in-35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly among educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and gave it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapid and widespread “gluten free” has become. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for a long enough time without keeping them in a lab environment 24/7 is incredibly difficult if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see if going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, researchers may find that the culprit is something other than gluten, or that gluten is better tolerated in foods that are more “alive”—foods retaining more of the plant’s natural enzymes and original components.
As an N of 1, if I eat gluten now, I get depressed the next day. I don’t seem to have any other negative symptoms as others with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of Hashimoto’s (lethargy, weight gain, skin issues) when I remain gluten free. I went off it for a while, started eating gluten again, and gained 20 pounds. But this might also be attributable to the fact that more foods were available (i.e., a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months of going strictly gluten free, every trace of reflux and GI distress disappeared, and over 12 years it has never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and go off them, you get rebound hyper-acidity at double your pre-med levels, and that lasts a couple of months. Great for pharma marketers. I used an Internet protocol from Jacob Teitelbaum MD to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living? So many people such as myself have found gluten-free living to resolve a host of problems even though we are not celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon’s razor: “Never attribute to malice that which is explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to not eat processed GF foods as much as possible and focus on fruits, veggies, nuts, good fats and protein. It is not easy because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
The U.S. may end up with the worst COVID-19 outbreak in the industrialized world. This is how it’s going to play out.
Three months ago, no one knew that SARS-CoV-2 existed. Now the virus has spread to almost every country, infecting at least 446,000 people whom we know about, and many more whom we do not. It has crashed economies and broken health-care systems, filled hospitals and emptied public spaces. It has separated people from their workplaces and their friends. It has disrupted modern society on a scale that most living people have never witnessed. Soon, most everyone in the United States will know someone who has been infected. Like World War II or the 9/11 attacks, this pandemic has already imprinted itself upon the nation’s psyche.
A global pandemic of this scale was inevitable. In recent years, hundreds of health experts have written books, white papers, and op-eds warning of the possibility. Bill Gates has been telling anyone who would listen, including the 18 million viewers of his TED Talk. In 2018, I wrote a story for The Atlantic arguing that America was not ready for the pandemic that would eventually come. In October, the Johns Hopkins Center for Health Security war-gamed what might happen if a new coronavirus swept the globe. And then one did. Hypotheticals became reality. “What if?” became “Now what?”
China warned Italy. Italy warned us. We didn’t listen. Now the onus is on the rest of America to listen to New York.
In the emergency-department waiting room, 150 people worry about a fever. Some just want a test, others badly need medical treatment. Those not at the brink of death have to wait six, eight, 10 hours before they can see a doctor. Those admitted to the hospital might wait a full day for a bed.
I am an emergency-medicine doctor who practices in both Manhattan and Queens; at the moment, I’m in Queens. Normally, I love coming to work here, even though in the best of times, my co-residents and I take care of one of New York City’s most vulnerable, underinsured patient populations. Many have underlying illnesses and a language barrier, and lack primary care.
The coronavirus outbreak may last for a year or two, but some elements of pre-pandemic life will likely be won back in the meantime.
The new coronavirus has brought American life to a near standstill, closing businesses, canceling large gatherings, and keeping people at home. All of those people must surely be wondering: When will things return to normal?
The answer is simple, if not exactly satisfying: when enough of the population—possibly 60 or 80 percent of people—is resistant to COVID-19 to stifle the disease’s spread from person to person. That is the end goal, although no one knows exactly how long it will take to get there.
There are two realistic paths to achieving this “population-level immunity.” One is the development of a vaccine. The other is for the disease to work its way through the population, surely killing many, but also leaving many others—those who contract the disease and then recover—immune. “They’re just Teflon at that point,” meaning they can’t get infected again and they won’t pass on the disease, explains Andrew Noymer, a public-health professor at the University of California at Irvine. Once enough people reach Teflon status—though we don’t yet know if recovering from the disease confers any immunity at all, let alone lifelong immunity—normalcy will be restored.
Trump is utterly unsuited to deal with this crisis, either intellectually or temperamentally.
For his entire adult life, and for his entire presidency, Donald Trump has created his own alternate reality, complete with his own alternate set of facts. He has shown himself to be erratic, impulsive, narcissistic, vindictive, cruel, mendacious, and devoid of empathy. None of that is new.
But we’re now entering the most dangerous phase of the Trump presidency. The pain and hardship that the United States is only beginning to experience stem from a crisis that the president is utterly unsuited to deal with, either intellectually or temperamentally. When things were going relatively well, the nation could more easily absorb the costs of Trump’s psychological and moral distortions and disfigurements. But those days are behind us. The coronavirus pandemic has created the conditions that can catalyze a destructive set of responses from an individual with Trump’s characterological defects and disordered personality.
“The thought of simply breathing in and out without coughing and reuniting with my children ... is goal enough. To—literally—live and let live will be enough.”
I can pinpoint the exact moment I started feeling off. My partner, Will, and I were on a bike ride on the afternoon of Wednesday, March 18, to escape our apartment and get some exercise. This was back when leaving a New York City apartment to get some exercise was still okay, or at least that’s what we’d read, or at least that’s what we thought? If the coronavirus pandemic has taught us anything, it’s that what is considered dogma today might change tomorrow.
Ten minutes into our bike ride, I was overcome by an intense fatigue. “I think I have to go back,” I said.
Back home, I felt chilled. Took my temperature: 99.1. I’m normally 97.1, but still, not a huge deal. We’d been so careful about wiping down doorknobs, washing our hands, and keeping everyone except for our family out of our apartment. I’d been ambiently worried enough that my 13-year-old son could be a silent carrier of the virus that I’d yanked him out of his public middle school and off the crowded subways four days before Mayor Bill de Blasio pulled the plug (far too belatedly, in my opinion). I was getting over a urinary-tract infection, so my fever, I thought, must be from that.
It has taken a good deal longer than it should have, but Americans have now seen the con man behind the curtain.
When, in January 2016, I wrote that despite being a lifelong Republican who worked in the previous three GOP administrations, I would never vote for Donald Trump, even though his administration would align much more with my policy views than a Hillary Clinton presidency would, a lot of my Republican friends were befuddled. How could I not vote for a person who checked far more of my policy boxes than his opponent?
What I explained then, and what I have said many times since, is that Trump is fundamentally unfit—intellectually, morally, temperamentally, and psychologically—for office. For me, that is the paramount consideration in electing a president, in part because at some point it’s reasonable to expect that a president will face an unexpected crisis—and at that point, the president’s judgment and discernment, his character and leadership ability, will really matter.
The government is showing how not to handle a pandemic.
By now, the global timeline of the coronavirus’s development has been well established: The first case reportedly appeared in mid-November; in December, the Chinese government was still attributing hospitalizations to a peculiar form of pneumonia; through January and February, the outbreak began spreading around the world; and its epicenter is today firmly in Europe and the United States.
Throughout, another set of events was unfolding here in India. Late last year, Prime Minister Narendra Modi’s Hindu-nationalist government introduced and passed a controversial new law, ostensibly in support of minorities in neighboring countries, that in fact openly discriminated against Muslims and undermined India’s secular foundations. Then, early this year, protests over that new law snowballed into a pogrom in which dozens of people—mostly Muslims—have been killed.
The president can’t simply cancel the fall balloting, but his state-level allies could still deliver him a second term.
Even under a normal president, the coronavirus pandemic would present real challenges to the 2020 American election. Everything about in-person voting could be dangerous. Waiting in line, touching a voting machine, and working in polling stations all run afoul of social-distancing mandates. Already, Maryland, Kentucky, Georgia, and Louisiana have postponed their presidential primaries, while Wyoming, New York, and Ohio have altered their voting procedures. Of course, other democracies face similar problems; the United Kingdom has postponed local elections for one year.
But under President Donald Trump, the possibilities for how the coronavirus could wreak havoc on the election are all the more concerning. This is not a president who cares about the sanctity of the electoral process. After all, he has never seemed particularly concerned about Russia’s efforts to manipulate the 2016 outcome (presumably because they were on his behalf), and he was impeached for demanding Ukrainian help in his reelection efforts.
Unless the country does dramatically more to provide them with the equipment they need to do their job safely, it risks disaster.
The morning before my shift, I try to stay busy with emails, writing, cleaning the house, anything really. If I sit and think about it too long, undisturbed, I get nervous. I’m afraid to go to work, and yet I’m told I must. The flitting anxiety swells as I pull on my scrubs and head to the car. The streets are empty. I drive alone into the epicenter. It peaks when I first step through the door into the jumble of patients in chairs, stretchers, and beds crowded around our cramped workstation, staff jammed together discussing care, writing notes, calling reports. Then I start, surrounded by my colleagues, and am too busy to think about it. The fear is as much for my family and friends as for me. Probably more. I’m a physician who works in an emergency department in Washington, D.C., and the coronavirus is spreading.
The coronavirus is making me experience what Germans poetically call heimweh, the hurt of being far from your native land.
In times of upheaval or natural catastrophe, the State Department often advises Americans to avoid some of the world’s poorest nations. When ISIS took over large parts of Syria and Mali descended into civil war, the federal government implored Americans not to go to those countries. One of the pieces of advice it offers to those who insist on visiting them anyway is rather blunt: “Draft a will.”
These warnings speak to a set of assumptions so obvious, they seem almost silly to spell out. America is a rich and stable country. So long as U.S. citizens stay home—or restrict their travel to other developed nations—they are likely to remain safe. Travel warnings tend to flow from north to south, rich to poor, democracy to dictatorship.