I appreciate the intense reactions to this blog so far, and respect the lingering skepticism. (Some of the nastiness I could do without, but it wouldn't be the Internet without some tasty pot-shots.) I certainly didn't expect to win over the entire crowd with a handful of short overview pieces containing little evidence and no depth. I get that smart Atlantic readers are going to scrutinize this stuff.
After three years of discussing it among friends, I also understand that this is an issue that often provokes a visceral response. We all have strong opinions about how we became who we are. We need to have these opinions--it's a part of forming an identity. After a century of genetic dogma, terms like "innate" and "gifted" are baked right into our language and thinking. I don't mean to suggest that we all believe that genes control everything. Instead, most of us believe in "nature" followed by "nurture": genes dispense various design instructions as our body is formed in utero, priming us with a certain level of intellectual, creative, and athletic potential; following this, environmental influences develop that potential to one degree or another. This is what we know to be true, and it makes perfect sense.
Well, it turns out not to work that way. But no one familiar with the new science of development expects these old beliefs to simply wash away in a few weeks or months just because a few smart-ass writers come along and say they know better.
As we try to present this stuff, there are a hundred sand traps of understandable confusion. When I argue that "innate" doesn't really exist, it may seem like I'm making the blank slate argument -- which I'm not. When I argue that talent is a process, it may seem like I'm arguing that anyone can do anything, which I'm not. When I argue that we can't really see our individual potential until we've applied extraordinary resources over many years, it may seem like I'm arguing that genetic differences don't matter -- which I'm not. When I criticize The Bell Curve, it may look like I'm an agent of the left pushing a liberal egalitarian agenda, which I'm not.*
What I am pushing is the consideration of a whole new paradigm. In doing so, I am of course just a conduit; the actual work comes from scientists like the University of Iowa team behind a recent journal article revisiting the old debate.
While their evidence is quite complex, their renewed argument is simple: "nature vs. nurture" doesn't adequately explain how we become who we are. That notion needs to be replaced.
Lead author John Spencer:
The nature-nurture debate has a pervasive influence on our lives, affecting the framework of research in child development, biology, neuroscience, personality and dozens of other fields. People have tried for centuries to shift the debate one way or the other, and it's just been a pendulum swinging back and forth. We're taking the radical position that the smarter thing is to just say 'neither' -- to throw out the debate as it has been historically framed and embrace the alternative perspective provided by developmental systems theory.**
"Developmental systems theory" is a vague mouthful, and the scientists behind these observations readily admit that they haven't yet found the most compelling new language to present their ideas to the public. But the basic idea, as I've written in previous posts, is that genes are not static; they are dynamic. Genes interact with the environment to form traits. The more closely scientists look at claims of so-called "hard-wired" behavior and abilities, the more they turn up evidence that actions and talents are formed in conjunction with the culture around them.
John Spencer again:
Researchers sometimes claim we're hard-wired for things, but when you peel through the layers of the experiments, the details matter and suddenly the evidence doesn't seem so compelling...When people say there's an innate constraint, they're making suppositions about what came before the behavior in question. Instead of acknowledging that at 12 months a lot of development has already happened and we don't exactly know what came before this particular behavior, researchers take the easy way out and conclude that there must be inborn constraints. That's the predicament scientists have gotten themselves into.
Imprinting is one of many examples reviewed by the Iowa researchers. In 1935, the Viennese zoologist Konrad Lorenz famously discovered that newborn chicks whose eggs had been incubated in isolation would still correctly pick out the call of their mother over that of another animal. It seemed the perfect little proof of innate ability.
But in 1997, Gilbert Gottlieb discovered the flaw in that assumption. It turned out that when fetal chicks were deprived of the ability to make vocal sounds inside their own eggs -- that is, the ability to teach themselves what their species sounded like -- they were unable to pick out the correct maternal call from among those of various animals.
Another famously innate quality is "dead reckoning," the ability of fish, birds, and mammals (including humans) to establish their current location based on past locations and movement history. How could young geese know how to fly home from 100 meters away without trial and error? With no apparent answer to the mystery, "innate" was again used as a catch-all explanation. Then it became clear that mother geese train their goslings' navigational skills through daily walks.
How could baby chicks find their way back to a mother without clear sight of her? It turned out that they simply reversed the sequence of movements they had made while getting lost.
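For readers who like to see the mechanics, here is a minimal sketch of that reversal as simple path integration -- each movement gets logged as a displacement vector, and the route home is just the negated sum of those vectors. This is my own illustration, not code from the Iowa paper, and the function names are invented for the example.

```python
# A minimal sketch of dead reckoning as path integration (illustrative only;
# the names and structure are my own, not drawn from the Iowa paper).
# Each step away from "home" is recorded as a displacement vector; the
# homeward heading is simply the negated sum of those displacements.

def integrate_path(steps):
    """Sum (dx, dy) displacement vectors into a net position."""
    x = y = 0.0
    for dx, dy in steps:
        x += dx
        y += dy
    return x, y

def homeward_vector(steps):
    """Reverse the accumulated path: negate the net displacement."""
    x, y = integrate_path(steps)
    return -x, -y

# A chick wanders away in three moves...
wandering = [(2.0, 1.0), (-0.5, 3.0), (1.5, -1.0)]
# ...and can head straight back by reversing the integrated path.
print(homeward_vector(wandering))  # (-3.0, -3.0)
```

The point of the toy example is only that "reversing your directions" is an accumulative, experience-dependent computation -- exactly the kind of slow bookkeeping that can masquerade as hard-wired magic.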
One by one, the Iowa researchers show, scientists have declared basic abilities explainable only by hard-wiring, only to have closer inspection and better tools later reveal a slow learning process. The consistent refrain: abilities form in conjunction with development, community, and context. Genes matter, but actual results require genetic expression in conjunction with the environment.
(One big problem with this new paradigm, explains John Spencer, "is that it's much more complicated to explain why the evidence is on shaky ground, and often the one-liner wins out over the 10-minute explanation.")
The Iowa paper also delves deeply into claims of human language innateness, including what is known as "shape bias." "Shape bias," the authors write, "simplifies the word learning situation and thereby aids vocabulary development, but it is not innate. Rather, it is the emergent product of a step-by-step cascade."
What does all of this have to do with Einstein's genius or your piano playing? Developmental systems theory tells us that, while genetic differences do matter, they cannot, on their own, determine what we become. From there, the whole idea of innate talent falls apart.
As this blog continues, you'll meet more of the scientists who are documenting and shaping these ideas. One of the things I'd like to do is bring them together as a community and give their umbrella notion a more accessible name. "Developmental genetics" is one possibility. "Environmental genetics" is another.
Suggestions are welcome.
[Thanks to Mark Blumberg, one of the University of Iowa authors and editor-in-chief of Behavioral Neuroscience.]
* I am guilty of being a liberal on most issues, and there are elements of this new paradigm that gel nicely with a liberal sensibility; but there are also some very uncomfortable moral implications to come to terms with. Every writer has biases, to be sure, but self-respecting journalists don't ignore or cherry-pick information because they like its political ramifications. I didn't write Data Smog because I wanted to bring down the Internet; I didn't offer some sanguine views on new surveillance technologies because I desire a police state; and I haven't been picking and choosing genetics and intelligence studies to prop up the Obama administration.
** These John Spencer quotes are taken from a University of Iowa press release about the journal article.