The vast majority of people who avoid gluten don’t have celiac disease or even a gluten sensitivity, but as reader Rachel can attest, there’s a big upside to the proliferation of all the GF products and menus fueled by the fad (even as Hamblin noted the downsides):
I found out 10 years ago this month that I had Celiac. I was having horrible stomach pain, reflux, ulcers, etc., and at 19 I had zero quality of life. My biopsy came back positive for Celiac but my blood-work was negative, so my doctors weren’t sure at the time how to diagnose me.
Going gluten-free 10 years ago was one of the most overwhelming and terrifying things I had ever experienced. My doctor flat-out told me I could continue to eat gluten but I would most likely develop colon cancer by the time I was 40.
I was living in Nashville, where everything was fried, I had no family around me, and nothing was labeled on food items. I remember crying in the grocery store because I had no idea what to buy. I thought, “Am I ever going to be able to eat a sandwich again??” I ate corn tortillas, hummus, eggs, and cheese for an entire month until I found some resources on Celiac.
As there has been a lot more awareness of Celiac over the years and even a cool factor to being gluten free, I have found it much easier to live this way without getting sick. I have traveled around the world and all over the U.S. and it’s been almost a non-issue in many places. I’m grateful for the awareness.
(Also, as a helpful hint, if I get gluten in my meal, I’ve found that sipping apple cider vinegar in water helps alleviate the symptoms. I’m not a doctor, but it helps tremendously.)
On the flip side, I tend to get many disparaging looks when I ask for a gluten-free menu, ask if something has gluten in it, or tell people I’m not able to eat it. In fact, I’m more likely not to tell someone and either go hungry or try to figure out an alternative option because of the negative responses.
I know that Celiac is genetic, and though I don’t have children right now, I worry about whether they will inherit the gene and whether I should start them on a gluten-free diet as babies. I guess I’ll just have to take it one day at a time, but all I know is that I’ll still be gluten free even when it’s not a cool thing to do.
You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but, for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with), I resisted until I was desperate. But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods, such as wheat, that contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I’ve never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and The Atlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the percent of Americans with celiac as high as 1-in-35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Ever since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products that are known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
Here’s the relevant passage from Hamblin’s piece:

[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly among educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
As someone who has had a lifelong gluten allergy (and gave it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapidly and how widely “gluten free” has caught on. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for a long enough time without keeping them in a lab environment 24/7, is incredibly difficult, if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see if going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, they may find that the culprit is something other than gluten, or that gluten might be healthier when eaten in food that’s more alive, with its natural enzymes and more of the plant’s original components intact.
As an N of 1, if I eat gluten now, I get depressed the next day. I don’t seem to have any other negative symptoms as others with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of Hashimoto’s (lethargy, weight gain, skin issues) when I remain gluten free. I went off it for a while, started eating gluten again, and gained 20 lbs. But this might also be attributable to the fact that more foods were available (e.g., a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel Prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months after going strictly gluten free, every trace of reflux and GI distress disappeared, and over 12 years it has never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and go off them, you get rebound hyper-acidity at double your pre-med levels, and that lasts a couple of months. Great for pharma marketers. I used a protocol from Jacob Teitelbaum, MD, that I found on the Internet to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living? So many people such as myself have found gluten-free living to resolve a host of problems even though they are not celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon’s razor: “Never attribute to malice that which is explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to avoid processed GF foods as much as possible and to focus on fruits, veggies, nuts, good fats, and protein. It is not easy, because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
When Michaeleen Doucleff met parents from around the world, she encountered millennia-old methods of raising good kids that made American parenting seem bizarre and ineffective.
At one point in her new book, the NPR journalist Michaeleen Doucleff suggests that parents consider throwing out most of the toys they’ve bought for their kids. It’s an extreme piece of advice, but the way Doucleff frames it, it seems entirely sensible: “Kids spent two hundred thousand years without these items,” she writes.
Doucleff arrives at this conclusion while traveling, with her then-3-year-old daughter, to meet and learn from parents in a Maya village on the Yucatán Peninsula in Mexico; in an Inuit town in a northern Canadian territory; and in a community of hunter-gatherers in Tanzania. During her outings, she witnesses well-adjusted, drama-free kids share generously with their siblings and do chores without being asked.
Adored guru and reviled provocateur, he dropped out of sight. Now the irresistible ordeal of modern cultural celebrity has brought him back.
One day in early 2020, Jordan B. Peterson rose from the dead. The Canadian academic, then 57, had been placed in a nine-day coma by doctors in a Russian clinic, after becoming addicted to benzodiazepines, a class of drug that includes Xanax and Valium. The coma kept him unconscious as his body went through the terrible effects of withdrawal; he awoke strapped to the bed, having tried to rip out the catheters in his arms and leave the intensive-care unit.
When the story of his detox became public, in February 2020, it provided an answer to a mystery: Whatever happened to Jordan Peterson? In the three years before he disappeared from view in the summer of 2019, this formerly obscure psychology professor’s name had been a constant presence in op-ed columns, internet forums, and culture-war arguments. His book 12 Rules for Life: An Antidote to Chaos, published in 2018, sold millions of copies, and he had conducted a 160-city speaking tour, drawing crowds of up to 3,000 a night; premium tickets included the chance to be photographed with him. For $90, his website offered an online course to better understand your “unique personality.” An “official merchandise store” sold Peterson paraphernalia: mugs, stickers, posters, phone cases, tote bags. He had created an entirely new model of the public intellectual, halfway between Marcus Aurelius and Martha Stewart.
A new study of the city’s program that sent cash to struggling individuals finds dramatic changes.
Two years ago, the city of Stockton, California, did something remarkable: It brought back welfare.
Using donated funds, the industrial city on the edge of the Bay Area tech economy launched a small demonstration program, sending payments of $500 a month to 125 randomly selected individuals living in neighborhoods with average incomes lower than the city median of $46,000 a year. The recipients were allowed to spend the money however they saw fit, and they were not obligated to complete any drug tests, interviews, means or asset tests, or work requirements. They just got the money, no strings attached.
These kinds of cash transfers are a common, highly effective method of poverty alleviation used all over the world, in low-income and high-income countries, in rural areas and cities, and particularly for households with children. But not in the United States. The U.S. spends less of its GDP on what are known as “family benefits” than any other country in the Organization for Economic Cooperation and Development, save Turkey. The Temporary Assistance for Needy Families (TANF) program spends less than one-fifth of its budget on direct cash aid, and its funding has been stuck at the same dollar amount since 1996—when the Clinton administration teamed up with congressional Republicans to turn it into a compulsory-work program. Those changes sliced into the safety net, allowing millions of people to fall through.
Why was the New York governor’s reckoning so long in coming?
Cable-news shows treated Andrew Cuomo like a living legend this summer, thanks to his supposedly superlative handling of the coronavirus pandemic, yet his past few weeks really have been the stuff of myth.
But which myth? Is he Icarus, flying too close to the sun in his premature attempt to claim credit for New York’s public-health prowess, only to have his wings melted by the heat of scandal? Is he Oedipus, brought low by his determination to eclipse his father? Or is he simply Zeus, a powerful man prone to wrathful outbursts and sexual misconduct?
The New York governor finds himself in a perilous position right now, though it is not yet clear how perilous. Cuomo’s COVID-19 approach no longer looks quite so good. Compared with other states, New York hasn’t obviously outperformed, and if not all of that is precisely Cuomo’s fault, it does make his decision to publish a book claiming credit back in October seem unwise. Worse are revelations about the number of deaths in New York nursing homes, especially after a top aide privately acknowledged that the administration had covered up the toll.
When the polio vaccine was declared safe and effective, the news was met with jubilant celebration. Church bells rang across the nation, and factories blew their whistles. “Polio routed!” newspaper headlines exclaimed. “An historic victory,” “monumental,” “sensational,” newscasters declared. People erupted with joy across the United States. Some danced in the streets; others wept. Kids were sent home from school to celebrate.
One might have expected the initial approval of the coronavirus vaccines to spark similar jubilation—especially after a brutal pandemic year. But that didn’t happen. Instead, the steady drumbeat of good news about the vaccines has been met with a chorus of relentless pessimism.
If the party doesn’t pass new protections, it could lose the House, Senate, and White House within the next four years.
The most explosive battle in decades over access to the voting booth will reach a new crescendo this week, as Republican-controlled states advance an array of measures to restrict the ballot, and the U.S. House of Representatives votes on the federal legislation that represents Democrats’ best chance to stop them.
It’s no exaggeration to say that future Americans could view the resolution of this struggle as a turning point in the history of U.S. democracy. The outcome could not only shape the balance of power between the parties, but determine whether that democracy grows more inclusive or exclusionary. To many civil-rights advocates and democracy scholars I’ve spoken with, this new wave of state-level bills constitutes the greatest assault on Americans’ right to vote since the Jim Crow era’s barriers to the ballot.
The GOP has become, in form if not in content, the Communist Party of the Soviet Union of the late 1970s.
We are living in a time of bad metaphors. Everything is fascism, or socialism; Hitler’s Germany, or Stalin’s Soviet Union. Republicans, especially, want their followers to believe that America is on the verge of a dramatic time, a moment of great conflict such as 1968—or perhaps, even worse, 1860. (The drama is the point, of course. No one ever says, “We’re living through 1955.”)
Ironically, the GOP is indeed replicating another political party in another time, but not as the heroes they imagine themselves to be. The Republican Party has become, in form if not in content, the Communist Party of the Soviet Union of the late 1970s.
I can already hear the howls about invidious comparisons. I do not mean that modern American Republicans are communists. Rather, I mean that the Republicans have entered their own kind of end-stage Bolshevism, as members of a party that is now exhausted by its failures, cynical about its own ideology, authoritarian by reflex, controlled as a personality cult by a failing old man, and looking for new adventures to rejuvenate its fortunes.
Focus on prioritization and process, not the assignment itself.
So much of the homework advice parents are given is theory-based, and therefore not entirely helpful in the chaos of day-to-day life. People are told that students should have “grit.” They should “learn from failure.” But it’s hard to know how to implement these ideas when what you really need is to support a kid who has a chemistry test and two papers due in the next 48 hours but seems to be focused only on Instagram.
Some parents manage to guide their kids through these moments with relative ease. Others hire tutors. The large majority of us, however, are stuck at home alone, trying to stave off our own breakdowns in the face of our children’s.
While reprimanding your child for not having started her homework earlier may be your natural instinct, in the midst of stress, it will only make her shut down or lash out. In our experience as teachers, tutors, and parents, the students who feel terrible about procrastinating are more likely to have anxiety and negative feelings that will only fuel their continued procrastination. So instead of admonishing your procrastinator, take a deep breath and try to figure out how she’s going to manage the tasks at hand. Help her make a realistic plan to manage her time. Try to model understanding, even when you’re upset.
Colonizing the red planet is a ridiculous way to help humanity.
There’s no place like home—unless you’re Elon Musk. A prototype of SpaceX’s Starship, which may someday send humans to Mars, is, according to Musk, likely to launch soon, possibly within the coming days. But what motivates Musk? Why bother with Mars? A video clip from an interview Musk gave in 2019 seems to sum up Musk’s vision—and everything that’s wrong with it.
In the video, Musk is seen reading a passage from Carl Sagan’s book Pale Blue Dot. The book, published in 1994, was Sagan’s response to the famous image of Earth as a tiny speck of light floating in a sunbeam—a shot he’d begged NASA to have the Voyager 1 spacecraft take in 1990 as it sailed into space, 3.7 billion miles from Earth. Sagan believed that if we had a photo of ourselves from this distance, it would forever alter our perspective of our place in the cosmos.
Crouching Tiger, Hidden Dragon. Slumdog Millionaire. Parasite. And now, Minari. For years, Asian performers have been overlooked for awards, even when they star in critically acclaimed films.
Only when he began editing Minari did the writer-director Lee Isaac Chung see exactly how much his cast had done for the story. The film, about a Korean American family starting a farm in 1980s Arkansas, was inspired by his childhood, but Chung told his actors he didn’t want them imitating anyone he knew. So instead, they brought their own interpretations to the characters and made Chung’s tale theirs, too. “It’s easy when you have these actors, and every take is good,” he told me over Zoom last month, chuckling. “You have nothing bad to work with.”
Yes, Chung is overflowing with praise for his cast, whom he thanked in his acceptance speech after Minari won a Golden Globe for best foreign-language film on Sunday. But he’s concerned that one actor isn’t seeing enough appreciation: Yeri Han, who plays Monica, the anxious wife of Steven Yeun’s idealistic Jacob. “In the editing room, she was the one who we were always centering our emotional story around,” Chung said of Han. “It’s her face, it’s her looks, and the way she picks at a bedspread because she’s upset. These little, subtle things that we knew: ‘This is making the film what it is.’” He paused. “And unfortunately, it’s invisible.”