Coffee is officially off the vice list as new studies show health benefits for ailments ranging from cancer to Parkinson's disease
I am not, by nature, a morning person. So ever since the age of 20, I have been a proud member of Coffee Achievers of the World--people whose daily intake of Morning Joe is an essential factor in getting the brain and body kick-started in the morning (or mid-afternoon, or before a college all-nighter). Getting a cup of coffee in the morning is such a high priority for me, in fact, that when I climbed a mountain high in the Himalayas, I took a zip-lock bag of Coffee Singles along with me. I could handle yaks, glaciers, and whatever other discomforts the day had to throw at me, as long as I could start it with a steaming hot cup of Java.
As a result, I have also spent the past decades periodically defending my habit to non-coffee-drinking friends and the occasional health-fanatic doctor--because, as we all knew, coffee was bad for you. "Look," I'd tell the critics. "I don't have many vices. So I'm very attached to the few I have."
Well, huzzah and hurrah, all that is changing!
In a study published Tuesday in the Journal of the National Cancer Institute, a group of Harvard researchers reported that coffee consumption actually reduces the risk of prostate cancer, and particularly lethal prostate cancer, in men. Not only that, but a Swedish study published last week in Breast Cancer Research indicates that coffee could also help reduce a woman's risk for post-menopausal, ER-negative breast cancer.
All of that is in addition to other recent studies that have found links between coffee consumption and a decreased risk of gallstones, type 2 diabetes, and Parkinson's disease, as well as lower rates of disease progression in liver cancer and cirrhosis. Other recent studies have indicated that coffee may not even increase a person's risk of heart disease or stroke. Turns out that coffee contains antioxidants and compounds that can improve glucose metabolism and insulin secretion. It also seems to have an effect on sex hormones, which is why researchers looked at its impact on prostate and breast cancer.
There are caveats to the results, of course. The strong correlation in the Harvard study came from men who drank six cups of coffee a day, and the Swedish study results applied to women who drank five or more cups of coffee a day. What's more, a German study (the MARIE study) that was used to validate the Swedish research findings did not show a statistically significant link between coffee consumption and a reduced risk of breast cancer--a result the Swedish researchers think may have to do with the fact that Swedish coffee is boiled, while German coffee is filtered. Of course, boiled coffee has also been shown to raise cholesterol levels, so drinking huge amounts of Swedish coffee in an effort to ward off ER-negative breast cancer might not be such a terrific idea. The Swedish paper also notes that the scientific community is still divided in its opinion of the toxicity of coffee.
But still. How has coffee managed to go from a universally agreed-upon vice to at least a potential virtue in such a relatively short period of time? The Harvard researchers suspect that part of the issue is that coffee drinking has traditionally been associated with other high-health-risk habits--e.g. drinking more alcohol, smoking, and not exercising--that muddied the waters of what role the coffee itself was playing.
"The difficulty of being able to separate the effects of coffee on health from the effects of associated behaviors, such as smoking or alcohol use, is one reason that coffee was seen as negative for so long," said Kathryn Wilson, one of the Harvard researchers. "Until there were computers that could handle the necessary statistics, along with studies with larger sample sizes, it was very difficult to control for multiple factors at once to see their individual effects on health outcomes."
The caveats are important, too. As an article in the New York Times Sunday Business section this week pointed out, scientific studies do support Quaker's claim that eating oatmeal can reduce cholesterol ... but only if you eat three or more bowls of it a day. Same with Activia's claims that the probiotics in its yogurt help to stimulate digestion (at least three servings a day). A "healthy" diet trying to hew to the standards of all these studies would be a horrific gorge-feast of multiple pots of coffee and so much oatmeal, yogurt, and other supposedly "healthy" foods that there'd likely be nothing all that healthy, and certainly nothing balanced, about it.
And that's not even taking into account the changing views on what foods are even healthy. Eggs were bad, and then good. The big benefits of soy milk are now suspect, even as coffee is seeing a reprieve. Drinking alcohol is a health risk, but drinking a moderate amount of red wine is good for your heart. Then again, a 2002 study by Spanish researchers found that people who drank more than two glasses of wine a day had a dramatically reduced risk of getting a cold. It's enough to make your head spin, trying to keep up with it all.
Given all of that, I asked the Harvard team what advice they had for the average person, based on their research results.
"I wouldn't recommend that men change their coffee consumption based on this study (or any single study)," Wilson answered. "[But] I think this study is part of mounting evidence that you don't need to feel guilty about your current coffee consumption."
Guiltless coffee. Is it possible? I might have to ponder that over a glass of red wine ... or another cup of steaming Java.
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
As incomes fall across the nation, even better-off areas like Sheboygan County, Wisconsin, are faltering.
SHEBOYGAN, Wisc.—There is still a sizable middle class in this county of 115,000 on the shores of Lake Michigan, a pleasant hour’s drive from Milwaukee. You can see it in the cars that pour in and out of the parking lots of local factories, in the restaurants packed with older couples on weeknights, and in the bars that seem to be on every single corner. You can see it in the local parks, including one called Field of Dreams, where kids play soccer and baseball and their parents sit and watch.
About 63 percent of adults in Sheboygan make between $41,641 and $124,924, meaning the area has one of the highest shares of middle-class households in the country, according to a report from the Pew Research Center. Nationally, only 51 percent of adults are middle-class.
In an era fixated with science, technology, and data, the humanities are in decline. They’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities are rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can seem almost antithetical at times to the pace of modern life, and as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners David Benioff and D.B. Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
The Model S’s Autopilot isn’t technically a driverless feature, but the federal investigation into why a driver using it was killed will still influence the future of driverless vehicles.
Federal officials are investigating a crash that killed the driver of a Model S, a Tesla vehicle with a partially autonomous driving system, in a move that has major implications for the future of driverless vehicles.
“This is the first known fatality in just over 130 million miles where Autopilot was activated …” Tesla wrote in a statement on Thursday. “It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.”
The investigation may be standard procedure, but it’s also certain to influence the ongoing conversation about the safety of self-driving vehicles.
The Model S isn’t technically a driverless car, but Tesla has been a vocal player in the race to bring truly driverless cars to market. The company’s Autopilot feature is an assistive technology, meaning that drivers are instructed to keep their hands on the wheel while using it—even though it is sophisticated enough to complete tasks like merging onto the highway. It wasn’t clear from Tesla’s statement how engaged the driver was at the time of the crash.