The vast majority of people who avoid gluten don’t have celiac disease or even a gluten sensitivity, but as reader Rachel can attest, there’s a big upside to the proliferation of all the GF products and menus fueled by the fad (even as Hamblin noted the downsides):
I found out 10 years ago this month that I had Celiac. I was having horrible stomach pain, reflux, ulcers, etc., and at 19 I had zero quality of life. My biopsy came back positive for Celiac but my blood work was negative, so my doctors weren’t sure at the time how to diagnose me.
Going gluten-free 10 years ago was one of the most overwhelming and terrifying things I had ever experienced. My doctor flat-out told me I could continue to eat gluten but I would most likely develop colon cancer by the time I was 40.
I was living in Nashville, where everything was fried, I had no family around me, and nothing was labeled on food items. I remember crying in the grocery store because I had no idea what to buy. I thought, “Am I ever going to be able to eat a sandwich again??” I ate corn tortillas, hummus, eggs, and cheese for an entire month until I found some resources on Celiac.
As there has been a lot more awareness of Celiac over the years and even a cool factor to being gluten free, I have found it much easier to live this way without getting sick. I have traveled around the world and all over the U.S. and it’s been almost a non-issue in many places. I’m grateful for the awareness.
(Also, as a helpful hint: if I get gluten in my meal, I’ve found that sipping apple cider vinegar in water helps alleviate the symptoms. I’m not a doctor, but it helps tremendously.)
On the flip side, I tend to get many disparaging looks when I ask for a gluten free menu, if something has gluten in it, or when I tell people I’m not able to eat it. In fact, I'm more likely to not tell someone and either go hungry or try to figure out an alternative option because of the negative responses.
I know that Celiac is genetic, and though I don’t have children right now, I worry about whether they will inherit the gene and whether I should start them on a gluten-free diet as babies. I guess I’ll just have to take it one day at a time, but all I know is that I’ll still be gluten free even when it’s not a cool thing to do.
You asked, so here’s my gluten-free story (safe for Celiacs to read):
I’m not a Celiac, but I do have Crohn’s disease, an inflammatory autoimmune disorder which often causes similar symptoms in the digestive tract. When I was first diagnosed with Crohn’s, a course of steroids followed by immunosuppressive drugs was enough to keep me in relatively good health.
Slowly, though, my symptoms returned. After two years, I was again underweight and anemic (a six-foot-tall male in my twenties, I weighed about 130 pounds at my lightest), with chronic, debilitating stomach pains and other symptoms which made my life very hard.
Friends who hadn’t seen me in months asked about my health as soon as they laid eyes on me. On more than one occasion, I experienced stomach cramps so severe I vomited until there was nothing left but bile. Occasionally, upon standing up too quickly, my vision would fade and my head would spin until I fell to the ground or found something to hold onto. These weren’t the kind of symptoms that can be alleviated through the placebo effect.
People suggested going gluten-free, but, for many of the reasons laid out in James Hamblin’s piece (much of which I still agree with), I resisted until I was desperate. But it worked. The pain receded. My digestion improved. I gained 30 pounds, leaving me thin, but not skeletally so.
I asked my gastroenterologist about this, and they suggested I pursue a low-FODMAP diet, which restricts foods like wheat which contain sugars that ferment during digestion. It kept the worst of the symptoms at bay and, along with my medicine, kept my inflammation at a low level. Eventually, even that low level of inflammation caused enough complications that I was put on more powerful medicine, but I've never again been as sick as I was.
In the end, it wasn’t the gluten that bothered me; it was the wheat itself. I found I could drink gluten-free beer, for example, but only the kind that was made from sorghum or other wheat substitutes. Wheat beer with the gluten removed still made me sick, and trace amounts of gluten never bothered me at all. But despite the fact that I wasn’t a Celiac, the availability of gluten-free products was a huge boon for me.
I appreciate what you, James, and The Atlantic are trying to do by educating the public on these issues. There’s so much pseudoscience surrounding this topic that I’m sometimes embarrassed to admit that I prefer to avoid wheat. But to suggest, by omission or otherwise, that Celiacs are the only people who can benefit from the explosion of gluten-free products ignores the clinical and day-to-day experiences of a great number of people, and I think that’s worth mentioning.
This reader’s on the same page:
Credible sources place the share of Americans with celiac as high as 1 in 35. But that understates the problem by ignoring people who are allergic to wheat but do not have celiac.
Ever since I was young, my fingers have swelled (not subtly) when I eat wheat products, and it seems more likely to happen with products known to be high in gluten (like pizza). Yet I test negative for celiac.
Is the test imperfect? Am I allergic to wheat? I’ve no idea, but it is not a trivial matter. I’m afraid we are in another of those moments when experts think they know it all, while there is much more to be learned.
A reader in Bend, Oregon, is far from gluten-free but nevertheless provides some good, er, food for thought:
Some people have commented that the increased gluten sensitivity in recent decades is due to modern, hybrid wheat varieties, high processing, added gluten, and/or a move away from traditional bread dough fermentation. Michael Pollan’s view was summarized in The Huffington Post piece “Michael Pollan Wants You To Eat Gluten”:
Pollan goes on to say that some people would do well to experiment with fermentation. More specifically, he thinks fermented sourdough is a smart alternative for a healthy gut. Fermented foods in general have been found to be beneficial for gut health, but sourdough bread has a more specific benefit, according to Pollan.
“[The] tradition of fermenting flour with sourdough breaks down the peptides in gluten that give people trouble,” he said. “Anecdotally, I’ve heard from lots of people that when they eat properly fermented bread, they can tolerate it.”
There is some emerging research to support Pollan’s perspective: A 2008 study fed subjects with gluten intolerances either sourdough or regular bread. Similarly, a very small 2012 study fed sourdough to participants with celiac, finding few to no physical side effects.
There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten.
Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten. . . . Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise.
I’m lucky; I can eat plenty of gluten and stay extremely healthy. I even eat seitan sometimes, which is pure wheat gluten. Yum.
[Avoiding gluten] has not been shown (in placebo-controlled studies) to benefit people who do not have the disease. Celiac disease is known to affect about one percent of people. Yet in a global survey of 30,000 people last year, fully 21 percent said that “gluten free” was a “very important” characteristic in their food choices. Among Millennials, the number is closer to one in three. The tendency to “avoid gluten” persists across socioeconomic strata, in households earning more than $75,000 just the same as those earning less than $30,000, and almost evenly across levels of educational attainment. The most common justification for doing so: “no reason.”
He goes on to detail the downsides of gluten-free replica products. A reader responds with a solid bit of advice:
For someone who has had a lifelong gluten allergy (and passed it to two of my three kids), the increased “trendiness” is a mixed bag. Yes, it means more choices, but it also means that people think my disease is just a trendy lifestyle choice and not a real thing. My general recommendation is not to use too many wheat substitutes. Instead of a gluten-free sandwich, have a salad or meat and veg. Instead of beer, have wine or hard liquor.
One of my part-time jobs right out of college, while interning and waiting tables, was doing research for a book that my roommate and his celiac-suffering business partner were putting together to help people travel and dine out gluten free. This was late 2004, and I had never heard of gluten, nor had any peers I talked to about the research gig. So over the past decade it’s been remarkable to see how rapid and widespread “gluten free” has become. Now my best friend is GF, for dermatological reasons, as is my mother, who swears that her GF diet has snuffed out some mild health problems—and she’s been a nurse for 40 years, so she’s very science- and health-oriented. Here’s another gluten-free reader who works in the sciences:
I work in human research. Getting people to keep accurate records of what they eat, or to maintain a specific diet for long enough without keeping them in a lab environment 24/7, is incredibly difficult, if not impossible. I am gluten-free due to promising science on Hashimoto’s thyroiditis (I am not celiac). If you have a problem linked to inflammation, it makes sense to see whether going gluten-free can reduce that inflammation.
In the future, as more and more studies are done, researchers may find that the culprit is something other than gluten, or that gluten is better tolerated in food that’s more alive and contains more of the plant’s original enzymes and components.
As an N of 1: if I eat gluten now, I get depressed the next day. I don’t seem to have the other negative symptoms that people with celiac do. There is no other “cure” for Hashimoto’s, but I tend to have fewer symptoms of it (lethargy, weight gain, skin issues) when I remain gluten free. I went off the diet for a while, started eating gluten again, and gained 20 pounds. But that might also be attributable to the fact that more foods were available (e.g., a whole pan of brownies).
I am realistic and yet still making the best choice for myself. I can understand if others are concerned it’s a harmful fad, but there also might actually be something to it, and so I don’t think it should be readily dismissed either.
Neither does this reader:
I recall Nobel Prize winner Dr. Barry Marshall commenting that half of what is taught in academic gastroenterology is flat wrong. [CB note: I couldn’t quickly find that quote, but here’s a Kathryn Schulz interview with Marshall about how he was right about ulcers when everyone else was wrong.] So it was not surprising to see solid research in the last few weeks showing that common reflux medications, proton pump inhibitors, pushed so hard by gastroenterologists, are strongly linked to dementia and cardiac dysfunction. [CB: Here’s a recent report along those lines.]
I have been gluten free for a dozen years. I am not celiac; I don’t even have the DNA for it. Prior to going gluten free, which was against gastroenterologists’ advice, I suffered from chronic severe reflux and GI problems daily and was becoming overweight. Within months after going strictly gluten free, every trace of reflux and GI distress disappeared, and over 12 years they have never returned. Within six months of going gluten free, I lost the 35 excess pounds I was carrying and have stayed at my ideal weight ever since.
My toughest problem in going gluten free was weaning myself off the proton pump inhibitors that GI docs had pushed on me. What they failed to tell me is that if you start these meds and then go off them, you get rebound hyper-acidity at double your pre-med levels, and it lasts a couple of months. Great for pharma marketers. I used a protocol from Jacob Teitelbaum, MD, that I found on the Internet to wean myself off PPIs in a couple of months. Never a hint of reflux since, in a dozen years.
I wonder what is motivating the recent quasi-academic pushback against gluten-free living. So many people like me have found that gluten-free living resolves a host of problems even without being celiac. I wonder if the financial interests involved are pushing back. But then I recall Hanlon's razor: “Never attribute to malice that which is explained by incompetence.”
I know through experience that GF people love to talk about going gluten free, so if you’d like to sound off on the subject, drop us an email. Update from a reader with some quick advice:
To those out there (like myself) who are gluten free to decrease inflammation, I caution you about the risk of added sugar in products labeled as GF. What has helped me is to not eat processed GF foods as much as possible and focus on fruits, veggies, nuts, good fats and protein. It is not easy because I often feel deprived. Hence my new focus on detoxing myself off the sugar as much as I can without adding another feeling of deprivation. Sigh.
The surprisingly short life of new electronic devices
Two years ago, Desmond Hughes heard so many of his favorite podcasters extolling AirPods, Apple’s tiny, futuristic $170 wireless headphones, that he decided they were worth the splurge. He quickly became a convert.
Hughes is still listening to podcasters talk about their AirPods, but now they’re complaining. The battery can no longer hold a charge, they say, rendering them functionally useless. Apple bloggers agree: “AirPods are starting to show their age for early adopters,” Zac Hall, an editor at 9to5Mac, wrote in a post in January, detailing how he frequently hears a low-battery warning in his AirPods now. Earlier this month, Apple Insider tested a pair of AirPods purchased in 2016 against a pair from 2018, and found that the older pair died after two hours and 16 minutes. “That’s less than half the stated battery life for a new pair,” the writer William Gallagher concluded.
Good news, America. Russia helped install your president. But although he owes his job in large part to that help, the president did not conspire or collude with his helpers. He was the beneficiary of a foreign intelligence operation, but not an active participant in that operation. He received the stolen goods, but he did not conspire with the thieves in advance.
This is what Donald Trump’s administration and its enablers in Congress and the media are already calling exoneration. But it offers no reassurance to Americans who cherish the independence and integrity of their political process.
The question unanswered by the attorney general’s summary of Special Counsel Robert Mueller’s report is: Why? Russian President Vladimir Putin took an extreme risk by interfering in the 2016 election as he did. Had Hillary Clinton won the presidency, the most likely outcome, Russia would have been exposed to fierce retaliation by a powerful adversary. The prize of a Trump presidency must have glittered alluringly indeed to Putin and his associates. Why?
Did they admire Trump’s anti-NATO, anti–European Union, anti-ally, pro–Bashar al-Assad, pro-Putin ideology?
Were they attracted by his contempt for the rule of law and dislike of democracy?
Did they hold compromising information about him, financial or otherwise?
Were there business dealings in the past, present, or future?
Or were they simply attracted by Trump’s general ignorance and incompetence, seeing him as a kind of wrecking ball to be smashed into the U.S. government and U.S. foreign policy?
Many public-spirited people have counted on Mueller to investigate these questions, too, along with the narrowly criminal questions in his assignment. Perhaps he did, perhaps he did not; we will know soon, either way. But those questions have always been the important topics.
The Trump presidency from the start has presented a national-security challenge first, a challenge to U.S. public integrity next. But in this hyper-legalistic society, those vital inquiries got diverted early into a law-enforcement matter. That was always a mistake, as I’ve been arguing for two years.
The unusual situation facing Robert Mueller does not justify a repeal of well-established traditions of confidentiality.
As the nation awaits the Mueller report, a return to first principles is in order. One relevant first principle was dramatically illustrated in the breach during the waning weeks of the 2016 presidential campaign. Then–FBI Director James Comey announced at a press conference that no criminal charges would be brought against Hillary Clinton. Comey didn’t stop there, however. In that press conference, which will continue to live in infamy, Comey sharply criticized the former secretary of state for her ill-considered conduct in housing a server in her private residence to receive official and, not infrequently, classified information.
The nation should have risen, as one, in righteous indignation in the aftermath of the Comey press conference. In a single misadventure, Comey both seized power that was not his—the power to seek an indictment, a prerogative that was entrusted to the attorney general—and violated one of the fundamental principles of public prosecution: Thou shalt not drag a subject or target of the investigation through the mud via public criticism. Prosecutors either seek an indictment or remain quiet.
A former Jehovah's Witness is using stolen documents to expose allegations that the religion has kept hidden for decades.
In March 1997, the Watchtower Bible and Tract Society, the nonprofit organization that oversees the Jehovah’s Witnesses, sent a letter to each of its 10,883 U.S. congregations, and to many more congregations worldwide. The organization was concerned about the legal risk posed by possible child molesters within its ranks. The letter laid out instructions on how to deal with a known predator: Write a detailed report answering 12 questions—Was this a onetime occurrence, or did the accused have a history of child molestation? How is the accused viewed within the community? Does anyone else know about the abuse?—and mail it to Watchtower’s headquarters in a special blue envelope. Keep a copy of the report in your congregation’s confidential file, the instructions continued, and do not share it with anyone.
Americans’ dairy consumption is about to get a lot more cultured. An Object Lesson.
Cottage cheese faced a problem: After World War II, batches of the soft, lumpy dairy concoction developed a propensity to take on a rancid odor and a bitter taste. That changed in 1951, when dairy researchers identified the culprits, three bacterial miscreants that produced this “slimy curd defect.” To prevent the condition, researchers advised cheesemakers to keep these bacteria from entering their manufacturing facilities in the first place. Thus ended the scourge.
Despite this and other advances in cottage-cheese production, like texture analyzers, high-powered microscopes, and trained human tasters, cottage cheese has never enjoyed the same popularity as yogurt. That’s because cottage cheese, once revered for its flavor and versatility, has taken a series of gut-punches in the dairy sector: enduring associations with weight loss, inconvenient packaging, and near-total displacement by its cousin, Greek yogurt, to name a few. But stalwart food scientists and artisanal dairy farmers have high hopes for the future of cottage cheese. With yogurt sales on the decline, a golden age of curds might be right around the corner.
After waking up with a searing pain that radiates down to my shoulders, I hunt for the culprit.
My body’s preferred way to remind me that I’m aging is through pain. In recent years, my level of consequence-free drinking has plummeted from “omg liMitLe$s!!” to one and a half standard glasses of Chardonnay. In yoga, I am often forced not to enter the “fullest expression of the pose” and instead to just kind of lie there.
And then there is The Tweak. About once a month—not at any certain time of the month, but roughly 12 times a year—I will wake up feeling like someone French-braided my neck muscles overnight. The pain burns from the base of my skull, down one side of my neck or the other, and onto the adjacent shoulder blade. The Tweak makes it impossible to rotate my head fully to one side or the other for the day. It’s not an athletic injury—I know no sport. It’s also not related to any underlying medical conditions that I know of, though when I talked with experts for this article, they asked me “if I am stressed,” which I took to be a rhetorical question.
Everything lawmakers needed to know about Trump and Russia was in the public record.
No matter what Attorney General William Barr reveals—or doesn’t—about Special Counsel Robert Mueller’s report, everything Congress needed to know about Donald Trump and Russia was already clear.
October 7, 2016, was the near-death experience of the Trump campaign. That Friday afternoon, David Fahrenthold of The Washington Post reported on an Access Hollywood tape in which Trump boasts of grabbing women. The shock battered the campaign. Speaker of the House Paul Ryan declared publicly that he was “sickened” by Trump, canceled a joint appearance with him, and declined to answer whether he still supported the Trump candidacy.
Less than one hour later, WikiLeaks dumped its largest and most damaging trove of hacked emails to and from Democratic operatives. It included two emails sent years before to the future Hillary Clinton campaign chairman John Podesta. The messages criticized the teachings of the Catholic Church on women and sexuality. The Trump campaign instantly seized on them as proof of the Clinton campaign’s supposed anti-Catholic animus—a useful weapon to help erase memories of Trump’s Twitter attacks on the pope earlier in 2016.
Even without a physical state, the Islamic State can still fund its main product: political violence.
BEIRUT—If you’re looking to transfer money here, there’s a chance you will be directed to Abu Shawkat. He works out of a small office in a working-class suburb of the Lebanese capital, but won’t give you its exact location. Instead, he’ll direct you to a nearby alleyway, and whether he shows up depends on whether he likes the look of you.
Abu Shawkat—not his real name—is part of the hawala system, which is often used to transfer cash between places where the banking system has broken down or is too expensive for some to access. If he agrees to do business, you’ll set a password and he will take your cash, then provide you with the contact information of a hawala broker in the city where your money is headed. Anyone who offers that specific password to that particular broker will get the funds. Thus, cash can travel across borders without any inquiry into who is sending or receiving it, or its purpose.
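For readers who want the mechanics spelled out, here is a minimal sketch, in Python, of that password-keyed handoff. Everything in it is invented for illustration (the broker class, the second city, the password, the amount), and a real hawala network runs on personal trust and informal settlement between brokers, not on software; the sketch only makes the flow of information concrete.

```python
# Toy model of the password-keyed hawala handoff described above.
# All names and numbers are invented for illustration.

class HawalaBroker:
    def __init__(self, city: str):
        self.city = city
        self.pending = {}      # password -> amount awaiting pickup
        self.receivable = 0.0  # what peer brokers owe this one, settled informally later

    def accept_cash(self, amount: float, password: str, peer: "HawalaBroker") -> None:
        """The sender hands over cash and sets a password; the broker in the
        destination city agrees to pay that amount to whoever presents it."""
        peer.pending[password] = amount
        peer.receivable += amount  # the payout broker is now owed this sum

    def release_cash(self, password: str) -> float:
        """Anyone who offers the right password gets the funds: no identity
        check, no record of sender, recipient, or purpose."""
        return self.pending.pop(password)  # fails if the password is wrong


# A transfer out of Beirut: only the password crosses the border.
beirut = HawalaBroker("Beirut")
abroad = HawalaBroker("another city")
beirut.accept_cash(500.0, "olive-harvest", abroad)
print(abroad.release_cash("olive-harvest"))  # 500.0, and no paper trail
```

What the sketch makes plain is that the cash itself never moves; only the password does, which is why the system leaves investigators so little to trace.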
Donald Cline must have thought no one would ever know. Then DNA testing came along.
The first Facebook message arrived when Heather Woock was packing for vacation, in August 2017. It was from a stranger claiming to be her half sibling. She assumed the message was some kind of scam; her parents had never told her she might have siblings. But the message contained one detail that spooked her. The sender mentioned a doctor, Donald Cline. Woock knew that name; her mother had gone to Cline for fertility treatments before she was born. Had this person somehow gotten her mother’s medical history?
Her mom said not to worry. So Woock, who is 33 and lives just outside Indianapolis, flew to the West Coast for her vacation. She got a couple more messages from other supposed half siblings while she was away. Their persistence was strange. But then her phone broke, and she spent the next week and a half outdoors in Seattle and Vancouver, blissfully disconnected.
In an unprecedented era of winner-take-all urbanism, left-behind cities need federal help.
As America’s big “superstar” cities pull away from the rest of the country, the former industrial hubs and rural towns left behind in today’s tech-driven economy are doing whatever they can to compete—and it isn’t always healthy. The contest to host Amazon’s second headquarters epitomized their problem. Desperate for tech cachet and tens of thousands of jobs, cities from Albany to Fresno stepped forward, in many cases by offering subsidies and tax breaks they could barely afford. Even then, Amazon anointed the thriving Washington, D.C., suburbs and (initially) New York City as its winners.
As I and my Brookings Institution colleagues Mark Muro and Bill Galston noted in a recent report, the U.S. economy suffers from a stark geographic divide. America’s largest cities—places such as New York, Seattle, and San Francisco—have accounted for 75 percent of the nation’s employment growth since 2015. Geographic inequality warps our politics. The counties that voted for Donald Trump in 2016 account for barely more than a third of the nation’s GDP.