I appreciate the intense reactions to this blog so far, and respect the lingering skepticism. (Some of the nastiness I could do without, but it wouldn't be the Internet without some tasty pot-shots.) I certainly didn't expect to win over the entire crowd with a handful of short overview pieces containing little evidence and no depth. I get that smart Atlantic readers are going to scrutinize this stuff.
After three years of discussing it among friends, I also understand that this is an issue that often provokes a visceral response. We all have strong opinions about how we became who we are. We need to have these opinions--it's a part of forming an identity. After a century of genetic dogma, terms like "innate" and "gifted" are baked right into our language and thinking. I don't mean to suggest that we all believe that genes control everything. Instead, most of us believe in "nature" followed by "nurture": genes dispense various design instructions as our body is formed in utero, priming us with a certain level of intellectual, creative, and athletic potential; following this, environmental influences develop that potential to some extent or another. This is what we know to be true, and it makes perfect sense.
Well, it turns out not to work that way. But no one familiar with the new science of development expects these old beliefs to simply wash away in a few weeks or months just because a few smart-ass writers come along and say they know better.
As we try to present this stuff, there are a hundred sand traps of understandable confusion. When I argue that "innate" doesn't really exist, it may seem like I'm making the blank slate argument -- which I'm not. When I argue that talent is a process, it may seem like I'm arguing that anyone can do anything, which I'm not. When I argue that we can't really see our individual potential until we've applied extraordinary resources over many years, it may seem like I'm arguing that genetic differences don't matter -- which I'm not. When I criticize The Bell Curve, it may look like I'm an agent of the left pushing a liberal egalitarian agenda, which I'm not.*
What I am pushing is the consideration of a whole new paradigm. In doing so, I am of course just a conduit.
While the evidence assembled by the University of Iowa researchers is quite complex, their renewed argument is simple: "nature vs. nurture" doesn't adequately explain how we become who we are. That notion needs to be replaced.
Lead author John Spencer:
The nature-nurture debate has a pervasive influence on our lives, affecting the framework of research in child development, biology, neuroscience, personality and dozens of other fields. People have tried for centuries to shift the debate one way or the other, and it's just been a pendulum swinging back and forth. We're taking the radical position that the smarter thing is to just say 'neither' -- to throw out the debate as it has been historically framed and embrace the alternative perspective provided by developmental systems theory.**
"Developmental systems theory" is a vague mouthful, and the scientists behind these observations readily admit that they haven't yet found the most compelling new language to present their ideas to the public. But the basic idea, as I've written in previous posts, is that genes are not static; they are dynamic. Genes interact with the environment to form traits. The more closely scientists look at claims of so-called "hard-wired" behavior and abilities, the more they turn up evidence that actions and talents are formed in conjunction with the culture around them.
John Spencer again:
Researchers sometimes claim we're hard-wired for things, but when you peel through the layers of the experiments, the details matter and suddenly the evidence doesn't seem so compelling...When people say there's an innate constraint, they're making suppositions about what came before the behavior in question. Instead of acknowledging that at 12 months a lot of development has already happened and we don't exactly know what came before this particular behavior, researchers take the easy way out and conclude that there must be inborn constraints. That's the predicament scientists have gotten themselves into.
Imprinting is one of many examples reviewed by the Iowa researchers. In 1935, Viennese zoologist Konrad Lorenz famously discovered that newborn chicks whose eggs were incubated in isolation would still correctly pick the call of their mother over that of another animal. It seemed the perfect little proof of innate ability.
But in 1997, Gilbert Gottlieb discovered the flaw in that assumption. It turned out that when fetal chicks were deprived of the ability to make vocal sounds inside their own eggs -- that is, the ability to teach themselves what their species sounded like -- they were unable to pick the correct maternal sound from various animals.
Another famously innate quality is "dead reckoning," the ability of fish, birds, and mammals (including humans) to establish one's current location based on past locations and movement history. How could young geese know how to fly home from 100 meters without trial and error? With no apparent answer to the mystery, the word "innate" was again used as a catch-all explanation. Then it became clear that mother geese train their goslings' navigational skills through daily walks.
How could baby chicks find their way back to a mother without clear sight of her? It turned out that they simply reversed the directions they had taken when getting lost.
One by one, the Iowa researchers show, scientists have declared basic abilities to be explainable only by hard-wiring, only to have a slow learning process later revealed under closer inspection and with better tools. The consistent refrain: abilities form in conjunction with development, community, and context. Genes matter, but actual results require genetic expression in conjunction with the environment.
(One big problem with this new paradigm, explains John Spencer, "is that it's much more complicated to explain why the evidence is on shaky ground, and often the one-liner wins out over the 10-minute explanation.")
The Iowa paper also delves deeply into claims of human language innateness, including what is known as "shape bias." "Shape bias," the authors write, "simplifies the word learning situation and thereby aids vocabulary development, but it is not innate. Rather, it is the emergent product of a step-by-step cascade."
What does all of this have to do with Einstein's genius or your piano playing? Developmental systems theory tells us that, while genetic differences do matter, they cannot, on their own, determine what we become. From there, the whole idea of innate talent falls apart.
As this blog continues, you'll meet more of the scientists who are documenting and shaping these ideas. One of the things I'd like to do is bring them together as a community and give their umbrella notion a more accessible name. "Developmental genetics" is one possibility. "Environmental genetics" is another.
Suggestions are welcome.
[Thanks to Mark Blumberg, one of the University of Iowa authors and editor-in-chief of Behavioral Neuroscience.]
* I am guilty of being a liberal on most issues, and there are elements of this new paradigm that gel nicely with a liberal sensibility; but there are also some very uncomfortable moral implications to come to terms with. Every writer has biases to be sure, but self-respecting journalists don't ignore or cherry-pick information because they like its political ramifications. I didn't write Data Smog because I wanted to bring down the Internet; I didn't offer some sanguine views on new surveillance technologies because I desire a police state, and I haven't been picking and choosing genetics and intelligence studies to prop up the Obama administration.
** These John Spencer quotes are taken from a University of Iowa press release about the journal article.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
Highly poisonous botulinum toxin (the stuff in Botox) has played a formidable role in the history of food and warfare. It is still a factor in prison-brewed alcohol and some canned foods, and can quickly kill a person.
After tanking up on “pruno,” a bootleg prison wine, eight maximum-security inmates at the Utah State prison in Salt Lake County tried to shake off more than just the average hangover. Their buzz faded into double vision, weakness, trouble swallowing, and vomiting. Tests confirmed that the detainees came down with botulism from their cellblock science experiment. In secret, a prison moonshiner mixed grapefruit, oranges, powdered drink mix, canned fruit, and water in a plastic bag. For the pièce de résistance, he added a baked potato filched from a meal tray weeks earlier and peeled with his fingernails. After days of fermentation and anticipation, the brewer filtered the mash through a sock, and then doled out the hooch to his fellow yardbirds.
One hundred years ago, a crisis in urban masculinity created the lumberjack aesthetic. Now it's making a comeback.
The first one I met was at an inauguration party in 2009. I was in a cocktail dress. He was in jeans, work boots, and a flannel shirt. He had John Henry tattooed on his bicep. He was white. Somehow, at a fairly elegant affair, he had found a can of PBR. Since then they’ve multiplied. You can see them in coffee shops and bars and artisanal butchers. They don't exactly cut down trees, but they might try their hand at agriculture and woodworking, even if only in the form of window-box herb gardens.
In the last month, these bearded, manly men even earned themselves a pithy nickname: the lumbersexuals. GearJunkie coined the term only a few weeks ago, and since then Jezebel, Gawker, The Guardian, and Time have jumped in to analyze their style. BuzzFeed even has a holiday gift guide for the lumbersexual in your life. (He would, apparently, like bourbon-flavored syrup and beard oil.)
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing is clear and unambiguous:
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”
It was widely seen as a counter-argument to claims that poor people are "to blame" for bad decisions and a rebuke to policies that withhold money from the poorest families unless they behave in a certain way. After all, if being poor leads to bad decision-making (as opposed to the other way around), then giving cash should alleviate the cognitive burdens of poverty, all on its own.
Sometimes, science doesn't stick without a proper anecdote, and "Why I Make Terrible Decisions," a comment published on Gawker's Kinja platform by a person in poverty, is a devastating illustration of the Science study. I've bolded what I found the most moving, insightful portions, but it's a moving and insightful testimony all the way through.
Better-informed consumers are ditching the bowls of sugar that were once a triumph of 20th-century marketing.
Last year, General Mills launched a new product aimed at health-conscious customers: Cheerios Protein, a version of its popular cereal made with whole-grain oats and lentils. Early reviews were favorable. The cereal, Huffington Post reported, tasted mostly like regular Cheerios, although “it seemed like they were sweetened and flavored a little more aggressively.” Meanwhile, ads boasted that the cereal would offer “long-lasting energy” as opposed to a sugar crash.
But earlier this month, the Center for Science in the Public Interest sued General Mills, saying that there's very little extra protein in Cheerios Protein compared to the original brand and an awful lot more sugar—17 times as much, in fact. So why would General Mills try to market a product as containing protein when it's really a box full of carbs and refined sugar?
One reason the underprivileged face an obesity crisis is that they rely on ineffective weight-loss strategies. In part, this is because economic uncertainty makes it harder to plan for workouts and healthy meals.
Poor people—and poor women in particular—are more likely to be overweight and obese. But what makes the obesity epidemic such a tough problem to solve is that the poorest Americans are also less likely to use proven weight-loss strategies, relying instead on quick fixes like diet pills.
For a new study published in the American Journal of Preventive Medicine, researchers from Concordia University looked at the incomes and health habits of more than 3,000 children and teens between the ages of 8 and 19 and more than 5,000 adults over the age of 20.
At least two-thirds of the study subjects reported attempting to reduce food intake or exercising in order to lose weight in the past year. Despite these efforts, the adults in the study gained an average of three pounds, while the youths gained about 12 pounds. The people in the lower income brackets gained about two pounds more than those in the highest one.
The statesman understood something most diplomats don’t: history—and how to apply it.
In his new biography of Henry Kissinger, the historian Niall Ferguson recalls that halfway through what became an eight-year research project, he had an epiphany. Tracing the story of how a young man from Nazi Germany became America’s greatest living statesman, he discovered not only the essence of Kissinger’s statecraft, but the missing gene in modern American diplomacy: an understanding of history.
For Ferguson, it was a humbling revelation. As he confesses in the introduction to Kissinger: “In researching the life and times of Henry Kissinger, I have come to realize that my approach was unsubtle. In particular, I had missed the crucial importance in American foreign policy of the history deficit: The fact that key decision-makers know almost nothing not just of other countries’ pasts but also of their own. Worse, they often do not see what is wrong with their ignorance.”
Why the ingrained expectation that women should desire to become parents is unhealthy
In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. As with other safe-haven laws, parents in Nebraska who felt unprepared to care for their babies could drop them off at a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to include an age limit for dropped-off children.
Within just weeks of the law passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family -- nine children from ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.
The Nebraska state government, realizing the tremendous mistake it had made, held a special session of the legislature to rewrite the law in order to add an age limitation. Governor Dave Heineman said the change would "put the focus back on the original intent of these laws, which is saving newborn babies and exempting a parent from prosecution for child abandonment. It should also prevent those outside the state from bringing their children to Nebraska in an attempt to secure services."