You asked, so here’s my gluten-free story (safe for celiacs to read):
I’m not a celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder that often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with), I resisted until I was desperate. But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods, like wheat, that contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, held my inflammation to a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I’ve never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But even though I’m not a celiac, the availability of gluten-free products has been a huge boon for me.
I appreciate what you, James, and The Atlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the share of Americans with celiac as high as one in 35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: a 2008 study fed subjects with gluten intolerance either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac disease and found few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
From Hamblin’s piece:
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly across levels of educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and passed it on to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapidly, and how widely, “gluten free” has caught on. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented.
Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for long enough, without keeping them in a lab environment 24/7, is incredibly difficult, if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see whether going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, researchers may find that the culprit is something other than gluten, or that gluten is healthier when eaten in less processed foods that retain more of the plant’s original components and natural enzymes.
As an N of 1: if I eat gluten now, I get depressed the next day. I don’t seem to have the other negative symptoms that people with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of it (lethargy, weight gain, skin issues) when I remain gluten free. I went off the diet for a while, started eating gluten again, and gained 20 pounds. But that might also be attributable to the fact that more foods were available (e.g., a whole pan of brownies).
I am realistic, yet I’m still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there might actually be something to it, so I don’t think it should be dismissed out of hand, either.
Neither does this reader:
I recall the Nobel laureate Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against my gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months of going strictly gluten free, every trace of reflux and GI distress disappeared, and in the 12 years since, it has never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and then go off them, you get rebound hyperacidity at double your pre-med levels, and it lasts a couple of months. Great for pharma marketers. I used a protocol I found online from Jacob Teitelbaum, MD, to wean myself off PPIs over a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living. So many people like me who are not celiac have found that gluten-free living resolves a host of problems. I wonder if the financial interests involved are pushing back. But then I recall Hanlon’s razor: “Never attribute to malice that which is explained by incompetence.”
I know from experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to avoid processed GF foods as much as possible and to focus on fruits, veggies, nuts, good fats, and protein. It’s not easy, because I often feel deprived. Hence my new focus on weaning myself off sugar as much as I can without adding another layer of deprivation. Sigh.