Are grains killing us--or at least, killing our New Year's resolution to lose some weight? Karl Smith looks at this infographic and wonders:
As always, no one disputes that wedging will result in weight loss. That is, driving calories-in and calories-out in opposite directions will lower the caloric content of the body. Excluding water, there is a rough relationship between caloric content and mass. For most fat, which is our primary concern in obesity, the relation is about 3,500 calories per pound.
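As a quick back-of-the-envelope illustration of that 3,500-calories-per-pound relation, here is a minimal Python sketch (the 500-calorie deficit and one-year span are invented for illustration, not figures from the text):

```python
CALORIES_PER_POUND_FAT = 3500  # rough conversion for body fat, per the text

def pounds_lost(daily_deficit: float, days: int) -> float:
    """Estimate fat loss from a sustained daily caloric deficit."""
    return daily_deficit * days / CALORIES_PER_POUND_FAT

# A hypothetical 500-calorie daily wedge sustained for a year:
print(round(pounds_lost(500, 365), 1))  # about 52 pounds
```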
The tendency of all animals is to try to get calories-in and calories-out to move in harmony. If you have to wedge, then that means this system has failed. At a minimum, we would like to know why.
Historically people worked a lot more than they do now. They also ate a lot more than they do now. From the year 1400 to 1970 average calories expended fell dramatically but so did average caloric intake. Obesity was never a severe problem. The system did not fail.
Then from 1970 to 2010 average calories expended actually rose but calories consumed rose more and obesity exploded. The system failed.
If you go and look at the actual graph though, you can see Gary Taubes's thesis on display. This is natural since the dataset used to make the graph is one of Gary's favorites. First you see meat falling, then fat falling/stalling, then sugar falling as various healthy eating theories rose to prominence.
The only thing that rises consistently is grains. Gary insinuates, and sometimes outright says, that obesity was caused by international health authorities encouraging people to eat more grain. The natural tendency is for people to eat more meat as calories become easier to obtain. He suggests that consciously overriding this mechanism led to an excess of insulin and possibly a deficiency of peptide YY, which are key regulators of caloric balance.
I am skeptical of this theory, but it is one that at least recognizes the underlying theoretical problems.
I too am skeptical of this theory, and here's why. It's totally true that if you look at the change in the American diet since the 1960s, grain consumption has gone up dramatically, growing right along with our waistlines. The problem is that this is only true if you start your clock in the 1960s. Check out my sadly less snazzy infographic showing the caloric contribution of various elements to the US food supply since 1910:
As you can see, we have never gotten back up to the nearly 40% of calories from grains that we consumed in the early part of the 20th century.
Now, of course, food supply is not a perfect proxy for food consumption--but it's a pretty good proxy, especially in an era before massive farm subsidies. Was it that all the grain consumed before 1950 was healthier whole grain? No. As flour became an industrial product in the late 19th century, mills began processing out the germ and other "whole wheat" elements because the fats in the germ caused the flour to go rancid. By 1914, your great-grandmothers were mostly baking with white flour. Polished ("white") rice was similarly well established, and for some of the same reasons. And of course corn, the other major American grain, does not have a healthier "whole" alternative.
To me, the Taubes theory only works if you start your dataset in the 1970s. If you look earlier, you notice that our sturdy forebears were in fact giant balls of carbohydrates (the grain figure doesn't even include the two hundred pounds of potatoes Americans ate every year in the early 20th century). Yet they were not fat.
I hear a lot about Taubes' theory from people pushing the notion that "we're evolved to eat meat and fruit, not processed grains". I mean, true as far as it goes--but it doesn't go very far. A ribeye and an arugula salad with olive oil and vinegar are almost as far from what our paleolithic ancestors ate as pasta primavera and an angel-food cake. The meat our ancestors ate in the wild was not mostly fat-rich steak--game animals don't have that much body fat, and their muscles are a lot less tender. We've selectively bred our domesticated animals for considerably more succulence than our ancestors enjoyed. In the rich world, we've also stopped eating the "gamier", more vitamin-rich organs. In fact, almost every fruit or vegetable you enjoy eating has been bred to be larger, higher-calorie, and lower in fiber and natural pesticides than what our pre-agricultural ancestors ate.
Yet it's only now that we're getting fat. Which suggests to me that the cause is something other than the variation from our "natural", meat-rich diet.
If the party cares about winning, it needs to learn how to appeal to the white working class.
The strategy was simple. A demographic wave—long-building, still-building—would carry the party to victory, and liberalism to generational advantage. The wave was inevitable, unstoppable. It would not crest for many years, and in the meantime, there would be losses—losses in the midterms and in special elections; in statehouses and in districts and counties and municipalities outside major cities. Losses in places and elections where the white vote was especially strong.
But the presidency could offset these losses. Every four years the wave would swell, receding again thereafter but coming back in the next presidential cycle, higher, higher. The strategy was simple. The presidency was everything.
The quality and variety of food in the U.S. has never been better. The business seems to be struggling. What’s really going on?
For restaurants in America, it is the best of times, and it is the worst of times.
Last century’s dystopians imagined that mediocre fast-food chains would take over every square inch of the country. But in cities across the U.S., residents are claiming that the local restaurant scene is in a golden age of variety and quality. I’ve heard it in Portland, Oregon, named the best food city in America by the Washington Post; in Washington, D.C., named the best food city in America by Bon Appetit; in New Orleans, where the number of restaurants grew 70 percent after Hurricane Katrina; in San Francisco, which boasts the most restaurants per capita in the country; and in Chicago, which has added several three-Michelin-star restaurants this decade. I live in New York, which will always lead the country in sheer abundance of dining options, but after years of visiting my sister in Los Angeles, I’m thoroughly convinced that America’s culinary capital has switched coasts.
The South Coast, a 30-mile drive from Palo Alto, is facing an affordable-housing shortage that is jeopardizing its agricultural heritage.
On the drive up the coast from the southernmost part of Northern California’s San Mateo County, Highway 1’s two lanes are surrounded by wind-whipped seas on one side and redwood forests on the other. The landscape is dotted with wild yellow mustard in the spring and pumpkins in the fall. A popular place for day-trippers to picnic, go wine-tasting, and shop at roadside farm stands, the region—affectionately nicknamed “the Slowcoast” for its unhurried pace—is a balm to the busyness nearby in Silicon Valley, to the east, and San Francisco, to the north.
Home to fewer than 3,000 people, the South Coast is the least densely populated part of the Bay Area. While it feels like a region unto itself, it is part of San Mateo County, which is where—just over the Santa Cruz Mountains—several big tech companies, such as Facebook and Oracle, are based. South of those firms’ campuses (in Santa Clara County) are the well-known tech hubs of Mountain View, Cupertino, and Palo Alto. San Mateo County is also the home of some of the wealthiest tech executives: The city of Atherton, about a 30-mile drive from the South Coast, was, according to Forbes, the country’s most expensive zip code in 2015 and the third-most expensive in 2016. The countywide median price for a single-family home reached $1.2 million last year.
The Supreme Court announced Monday it will review the president’s controversial executive order next term. But in the meantime, the administration can enforce some of its provisions.
The U.S. Supreme Court agreed on Monday to review a series of lower-court rulings blocking the Trump administration’s controversial travel ban, setting up a major showdown over presidential power and religious discrimination.
In an unsigned order issued on the Court’s last day before its summer recess, the justices scheduled oral arguments in the case for when they return in October. They also partially lifted the lower courts’ injunctions against Section 2(c) of President Trump’s executive order, which temporarily suspended visa applications from six Muslim-majority countries, as well as Section 6, which froze the U.S. Refugee Admissions Program and halted refugee entry into the United States.
The GOP planned a dynastic restoration in 2016. Instead, it triggered an internal class war. Can the party reconcile the demands of its donors with the interests of its rank and file?
The angriest and most pessimistic people in America aren’t the hipster protesters who flitted in and out of Occupy Wall Street. They aren’t the hashtavists of #BlackLivesMatter. They aren’t the remnants of the American labor movement or the savvy young dreamers who confront politicians with their American accents and un-American legal status.
The angriest and most pessimistic people in America are the people we used to call Middle Americans. Middle-class and middle-aged; not rich and not poor; people who are irked when asked to press 1 for English, and who wonder how white male became an accusation rather than a description.
You can measure their pessimism in polls that ask about their expectations for their lives—and for those of their children. On both counts, whites without a college degree express the bleakest view. You can see the effects of their despair in the new statistics describing horrifying rates of suicide and substance-abuse fatality among this same group, in middle age.
Anthony Fauci, head of the National Institute of Allergy and Infectious Diseases, on persuading anti-vaxers, predicting the next outbreak, and working with Trump.
If you run into a left-leaning “consultant” these days, there’s a fairly good chance they used to work for the Obama administration. Scores of federal officials and bureaucrats have resigned or been fired since President Trump’s inauguration, some after realizing their goals were not in line with the new president’s.
Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, wasn’t one of them. In fact, he seemed surprised at the suggestion that he might do something other than what he’s been doing since he began leading the institute in 1984—trying to protect people from diseases like Ebola, Zika, and HIV.
This is despite the fact that some of Trump’s policy proposals seem to directly contradict his efforts. Trump has proposed cutting funding for a program that provides HIV drugs to people in poor countries by 17 percent. Not long after, six members of the Presidential Advisory Council on HIV/AIDS resigned, citing “a president who simply does not care.”
The Georgia congressional race didn’t show a party on pace to take back the House next year.
In the wake of last week’s special congressional election in Georgia, on which Democrats spent more than $30 million only to come up short, some on the left have taken solace in the idea that the result was nonetheless a good portent—a sign that Democratic candidates are poised to win the House next year.
The Georgia race, they point out, took place in a “very Republican district”—one that went for its Republican representative, Tom Price, by a 23-point margin last year. (Price triggered the special election when he took the job of health and human services secretary in the Trump administration.) Republican Karen Handel, by contrast, won by just 4 percentage points, 52 percent to 48 percent, over the Democrat, Jon Ossoff.
A professor of political philosophy counsels that “it is often possible to recognize and respect the moral integrity of others even when we disagree with them.”
For Michele Moody-Adams, a professor of political philosophy and legal theory at Columbia University, ensuring continued peace and prosperity in the United States depends not only on our ability to restore trust in government and the officials who run it. It is just as critical “that we figure out how to reawaken a sense of solidarity with each other as citizens, and to revive the belief that solidarity is best expressed by a commitment to shared sacrifice and an openness to constructive compromise.”
How to rebuild the sense of solidarity that has defined American life in moments of shared crisis, like World War II, but appears to have waned in more recent decades?
The sacrifices and compromises that matter are not just those associated with the demands of war or other national crises. We must learn, for instance, to relinquish resentments towards the ‘opposition’ when we lose out in a political contest and to refrain from smug self-righteousness when we win.
We must encourage our political leaders to be open to constructive compromise when political consensus is out of reach. We must also be more willing to tolerate the public expression of attitudes with which we disagree, and we must accept that even the best-designed legal institutions and practices may yield decisions which many believe to be mistaken. Democratic cooperation will always produce what John Rawls called the “strains of commitment,” and our continued flourishing as a democracy depends upon a readiness to acknowledge and accept these strains.
How leaders lose mental capacities—most notably for reading other people—that were essential to their rise
If power were a prescription drug, it would come with a long list of known side effects. It can intoxicate. It can corrupt. It can even make Henry Kissinger believe that he’s sexually magnetic. But can it cause brain damage?
When various lawmakers lit into John Stumpf at a congressional hearing last fall, each seemed to find a fresh way to flay the now-former CEO of Wells Fargo for failing to stop some 5,000 employees from setting up phony accounts for customers. But it was Stumpf’s performance that stood out. Here was a man who had risen to the top of the world’s most valuable bank, yet he seemed utterly unable to read a room. Although he apologized, he didn’t appear chastened or remorseful. Nor did he seem defiant or smug or even insincere. He looked disoriented, like a jet-lagged space traveler just arrived from Planet Stumpf, where deference to him is a natural law and 5,000 a commendably small number. Even the most direct barbs—“You have got to be kidding me” (Sean Duffy of Wisconsin); “I can’t believe some of what I’m hearing here” (Gregory Meeks of New York)—failed to shake him awake.
New documents show how, when given the opportunity, the Democratic Party was as ruthless as its GOP counterpart in trying to redistrict its rivals out of existence.
In spring 2011, the six Democratic members of Maryland’s congressional delegation tasked Eric Hawkins with two key jobs: Draw new district lines that get us re-elected easily for another five terms, while also taking direct aim at the state’s last two Republicans.
Behind closed doors, Democratic insiders and high-ranking aides referred to it as “the 7-1 map.” Hawkins—an analyst at a Beltway data firm called NCEC Services—not only made it happen, but imagined an 8-0 map that might have shut Republicans out of power altogether. That, however, would have required spreading Democratic voters a little too thin and made some incumbents slightly less safe; these congressmen were partisans, sure, but they were also reluctant to risk their own seats.