I appreciate the intense reactions to this blog so far, and respect the lingering skepticism. (Some of the nastiness I could do without, but it wouldn't be the Internet without some tasty pot-shots.) I certainly didn't expect to win over the entire crowd with a handful of short overview pieces containing little evidence and no depth. I get that smart Atlantic readers are going to scrutinize this stuff.
After three years of discussing it among friends, I also understand that this is an issue that often provokes a visceral response. We all have strong opinions about how we became who we are. We need to have these opinions--it's a part of forming an identity. After a century of genetic dogma, terms like "innate" and "gifted" are baked right into our language and thinking. I don't mean to suggest that we all believe that genes control everything. Instead, most of us believe in "nature" followed by "nurture": genes dispense various design instructions as our body is formed in utero, priming us with a certain level of intellectual, creative, and athletic potential; following this, environmental influences develop that potential to some extent or another. This is what we know to be true, and it makes perfect sense.
Well, it turns out not to work that way. But no one familiar with the new science of development expects these old beliefs to wash simply away in a few weeks or months just because a few smart-ass writers come along and say they know better.
As we try to present this stuff, there are a hundred sand traps of understandable confusion. When I argue that "innate" doesn't really exist, it may seem like I'm making the blank slate argument -- which I'm not. When I argue that talent is a process, it may seem like I'm arguing that anyone can do anything, which I'm not. When I argue that we can't really see our individual potential until we've applied extraordinary resources over many years, it may seem like I'm arguing that genetic differences don't matter -- which I'm not. When I criticize The Bell Curve, it may look like I'm an agent of the left pushing a liberal egalitarian agenda, which I'm not.*
What I am pushing is the consideration of a whole new paradigm. In doing so, I am of course just a conduit.
While the evidence assembled by the University of Iowa researchers is quite complex, their renewed argument is simple: "nature vs. nurture" doesn't adequately explain how we become who we are. That notion needs to be replaced.
Lead author John Spencer:
The nature-nurture debate has a pervasive influence on our lives, affecting the framework of research in child development, biology, neuroscience, personality and dozens of other fields. People have tried for centuries to shift the debate one way or the other, and it's just been a pendulum swinging back and forth. We're taking the radical position that the smarter thing is to just say 'neither' -- to throw out the debate as it has been historically framed and embrace the alternative perspective provided by developmental systems theory.**
"Developmental systems theory" is a vague mouthful, and the scientists behind these observations readily admit that they haven't yet found the most compelling new language to present their ideas to the public. But the basic idea, as I've written in previous posts, is that genes are not static; they are dynamic. Genes interact with the environment to form traits. The more closely scientists look at claims of so-called "hard-wired" behavior and abilities, the more they turn up evidence that actions and talents are formed in conjunction with the culture around them.
John Spencer again:
Researchers sometimes claim we're hard-wired for things, but when you peel through the layers of the experiments, the details matter and suddenly the evidence doesn't seem so compelling...When people say there's an innate constraint, they're making suppositions about what came before the behavior in question. Instead of acknowledging that at 12 months a lot of development has already happened and we don't exactly know what came before this particular behavior, researchers take the easy way out and conclude that there must be inborn constraints. That's the predicament scientists have gotten themselves into.
Imprinting is one of many examples reviewed by the Iowa researchers. In 1935, Viennese zoologist Konrad Lorenz famously discovered that newborn chicks whose eggs were incubated in isolation will still correctly pick the call of their mother over the call of another animal. It seemed the perfect little proof of innate ability.
But in 1997, Gilbert Gottlieb discovered the flaw in that assumption. It turned out that when fetal chicks were deprived of the ability to make vocal sounds inside their own eggs -- that is, the ability to teach themselves what their species sounded like -- they were unable to pick the correct maternal sound from various animals.
Another famously innate quality is "dead reckoning," the ability of fish, birds, and mammals (including humans) to establish their current location based on past locations and movement history. How could young geese know how to fly home from 100 meters without trial and error? With no apparent answer to the mystery, the word "innate" was again pressed into service as a catch-all explanation. Then it became clear that mother geese train their goslings' navigational skills through daily walks.
How could baby chicks find their way back to a mother without clear sight of her? It turned out that they simply retraced, in reverse, the path they had taken while wandering away.
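The chicks' trick is a simple form of path integration, the mechanism behind dead reckoning. As a toy sketch (not from the article): an animal that keeps a running tally of its own displacements can head straight home by following the negation of their sum, with no landmarks required.

```python
# Toy illustration of dead reckoning (path integration): summing each
# displacement since leaving home yields a "home vector" -- the single
# straight-line move that returns the animal to its starting point.

def home_vector(steps):
    """steps: list of (dx, dy) displacements since leaving home.
    Returns the one (dx, dy) move that leads directly back."""
    x = sum(dx for dx, _ in steps)
    y = sum(dy for _, dy in steps)
    return (-x, -y)

# Wander east, then north, then northwest...
path = [(2, 0), (0, 3), (-1, 1)]
print(home_vector(path))  # (-1, -4): straight back to the start
```

The point of the Iowa critique, of course, is that even this bookkeeping is learned through experience rather than pre-installed.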
One by one, the Iowa researchers show, scientists have declared basic abilities explainable only by hard-wiring, only to have closer inspection and better tools later reveal a slow learning process. The consistent refrain: abilities form in conjunction with development, community, and context. Genes matter, but actual results require genetic expression in conjunction with the environment.
(One big problem with this new paradigm, explains John Spencer, "is that it's much more complicated to explain why the evidence is on shaky ground, and often the one-liner wins out over the 10-minute explanation.")
The Iowa paper also delves deeply into claims of human language innateness, including what is known as "shape-bias." "Shape bias," the authors write, "simplifies the word learning situation and thereby aids vocabulary development, but it is not innate. Rather, it is the emergent product of a step-by-step cascade."
What does all of this have to do with Einstein's genius or your piano playing? Developmental systems theory tells us that, while genetic differences do matter, they cannot, on their own, determine what we become. From there, the whole idea of innate talent falls apart.
As this blog continues, you'll meet more of the scientists who are documenting and shaping these ideas. One of the things I'd like to do is bring them together as a community and give their umbrella notion a more accessible name. "Developmental genetics" is one possibility. "Environmental genetics" is another.
Suggestions are welcome.
[Thanks to Mark Blumberg, one of the University of Iowa authors and editor-in-chief of Behavioral Neuroscience.]
* I am guilty of being a liberal on most issues, and there are elements of this new paradigm that gel nicely with a liberal sensibility; but there are also some very uncomfortable moral implications to come to terms with. Every writer has biases to be sure, but self-respecting journalists don't ignore or cherry-pick information because they like its political ramifications. I didn't write Data Smog because I wanted to bring down the Internet; I didn't offer some sanguine views on new surveillance technologies because I desire a police state, and I haven't been picking and choosing genetics and intelligence studies to prop up the Obama administration.
** These John Spencer quotes are taken from a University of Iowa press release about the journal article.
19 Kids and Counting built its reputation on preaching family values, but the mass-media platforms that made the family famous might also be their undoing.
On Thursday, news broke that Josh Duggar, the oldest son of the Duggar family's 19 children, had, as a teenager, allegedly molested five underage girls. Four of them, allegedly, were his sisters.
The information came to light because, in 2006—two years before 17 Kids and Counting first aired on TLC, and thus two years before the Duggars became reality-TV celebrities—the family recorded an appearance on The Oprah Winfrey Show. Before the taping, an anonymous source sent an email to Harpo warning the production company about Josh’s alleged molestation. Harpo forwarded the email to authorities, triggering a police investigation (the Oprah appearance never aired). The news was reported this week by In Touch Weekly—after the magazine filed a Freedom of Information Act request to see the police report on the case—and then confirmed by the Duggars in a statement posted on Facebook.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
In an interview, the U.S. president ties his legacy to a pact with Tehran, argues ISIS is not winning, warns Saudi Arabia not to pursue a nuclear-weapons program, and anguishes about Israel.
On Tuesday afternoon, as President Obama was bringing an occasionally contentious but often illuminating hour-long conversation about the Middle East to an end, I brought up a persistent worry. “A majority of American Jews want to support the Iran deal,” I said, “but a lot of people are anxiety-ridden about this, as am I.” Like many Jews—and also, by the way, many non-Jews—I believe that it is prudent to keep nuclear weapons out of the hands of anti-Semitic regimes. Obama, who earlier in the discussion had explicitly labeled the supreme leader of Iran, Ayatollah Ali Khamenei, an anti-Semite, responded with an argument I had not heard him make before.
“Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said, referring to the apparently almost-finished nuclear agreement between Iran and a group of world powers led by the United States. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Why agriculture may someday take place in towers, not fields
A couple of Octobers ago, I found myself standing on a 5,000-acre cotton farm on the outskirts of Lubbock, Texas, shoulder-to-shoulder with a third-generation cotton farmer. He swept his arm across the flat, brown horizon of his field, which was at that moment being plowed by an industrial-sized picker—a toothy machine as tall as a house and operated by one man. The picker’s yields were being dropped into a giant pod to be delivered late that night to the local gin. And far beneath our feet, the Ogallala aquifer dwindled away at its frighteningly swift pace. When asked about this, the farmer spoke of reverse osmosis—the process of desalinating water—in which he seemed to put his faith, and which kept him unafraid of famine and permanent drought.
Singapore’s mind-bending logical riddles are so last month. Enter: Vietnam, the latest country to be swept up in what could easily be known as “the viral-math epidemic of 2015.”
This one might even trump its Singaporean predecessor, which became a global legend earlier this year. That quandary, for those who aren’t familiar with it, asked fifth-graders to figure out the birthday of a certain “Cheryl,” who gave two of her friends—“Albert” and “Bernard”—a list of 10 possible dates. She then privately told Albert the month, and Bernard the day. (“Albert: I don’t know when Cheryl’s birthday is, but I know that Bernard does not know too. Bernard: At first I don’t know when Cheryl’s birthday is, but I now know. Albert: Then I also know when Cheryl’s birthday is.”)
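The Cheryl puzzle can actually be cracked mechanically by eliminating candidates after each statement. Here is a brute-force sketch; the ten candidate dates are not listed in the excerpt above, so they are filled in from the widely circulated original problem statement (whose famous answer is July 16).

```python
# Candidate (month, day) pairs from the well-known Singapore puzzle.
dates = [(5, 15), (5, 16), (5, 19), (6, 17), (6, 18),
         (7, 14), (7, 16), (8, 14), (8, 15), (8, 17)]

def days_in(month, pool):
    return [d for m, d in pool if m == month]

def months_with(day, pool):
    return [m for m, d in pool if d == day]

# Albert (told only the month) doesn't know the date, and knows Bernard
# (told only the day) doesn't either: no date in Albert's month may have
# a day that is unique across the whole list.
unique_days = {d for m, d in dates if len(months_with(d, dates)) == 1}
step1 = [(m, d) for m, d in dates
         if not any(dd in unique_days for dd in days_in(m, dates))]

# Bernard now knows: his day must appear exactly once among the survivors.
step2 = [(m, d) for m, d in step1 if len(months_with(d, step1)) == 1]

# Albert now knows too: his month must appear exactly once among those.
step3 = [(m, d) for m, d in step2 if len(days_in(m, step2)) == 1]

answer = step3[0]
print(answer)  # (7, 16) -- i.e., July 16
```

Each filtering step mirrors one line of the dialogue, which is exactly how the fifth-graders were expected to reason through it by hand.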
In any case, people have probably heard the phrase in reference to something gone awry at work or in life. In either setting, when the shit does hit the fan, people will tend to look to the most competent person in the room to take over.
And too bad for that person. A new paper by a team of researchers from Duke University, the University of Georgia, and the University of Colorado looks at not only how extremely competent people are treated by their co-workers and peers, but how those people feel when, at crucial moments, everyone turns to them. They find that responsible employees are not terribly pleased about this dynamic either.
A scholar’s analysis of American culture presumes too much.
Last week, Gawker interviewed Robin DiAngelo, a professor of multicultural education at Westfield State University. She discussed aspects of her thinking on whiteness, which are set forth at length in her book, What Does It Mean to Be White? I’ve ordered the book.
Meanwhile, her remarks on police brutality piqued my interest. Some of what Professor DiAngelo said is grounded in solid empirical evidence: blacks and Hispanics are disproportionately victimized by misbehaving police officers; there are neighborhoods where police help maintain racial and class boundaries. And if our culture, which she calls “the water we swim in,” contained fewer parts racism per million, I suspect that police brutality would be less common.
Advocates say that a guaranteed basic income can lead to more creative, fulfilling work. The question is how to fund it.
Scott Santens has been thinking a lot about fish lately. Specifically, he’s been reflecting on the aphorism, “If you give a man a fish, he eats for a day. If you teach a man to fish, he eats for life.” What Santens wants to know is this: “If you build a robot to fish, do all men starve, or do all men eat?”
Santens is 37 years old, and he’s a leader in the basic income movement—a worldwide network of thousands of advocates (26,000 on Reddit alone) who believe that governments should provide every citizen with a monthly stipend big enough to cover life’s basic necessities. The idea of a basic income has been around for decades, and it once drew support from leaders as different as Martin Luther King Jr. and Richard Nixon. But rather than waiting for governments to act, Santens has started crowdfunding his own basic income of $1,000 per month. He’s nearly halfway to his goal.