I appreciate the intense reactions to this blog so far, and respect the lingering skepticism. (Some of the nastiness I could do without, but it wouldn't be the Internet without some tasty pot-shots.) I certainly didn't expect to win over the entire crowd with a handful of short overview pieces containing little evidence and no depth. I get that smart Atlantic readers are going to scrutinize this stuff.
After three years of discussing it among friends, I also understand that this is an issue that often provokes a visceral response. We all have strong opinions about how we became who we are. We need to have these opinions--it's a part of forming an identity. After a century of genetic dogma, terms like "innate" and "gifted" are baked right into our language and thinking. I don't mean to suggest that we all believe that genes control everything. Instead, most of us believe in "nature" followed by "nurture": genes dispense various design instructions as our body is formed in utero, priming us with a certain level of intellectual, creative, and athletic potential; following this, environmental influences develop that potential to some extent or another. This is what we know to be true, and it makes perfect sense.
Well, it turns out not to work that way. But no one familiar with the new science of development expects these old beliefs to simply wash away in a few weeks or months just because a few smart-ass writers come along and say they know better.
As we try to present this stuff, there are a hundred sand traps of understandable confusion. When I argue that "innate" doesn't really exist, it may seem like I'm making the blank slate argument -- which I'm not. When I argue that talent is a process, it may seem like I'm arguing that anyone can do anything, which I'm not. When I argue that we can't really see our individual potential until we've applied extraordinary resources over many years, it may seem like I'm arguing that genetic differences don't matter -- which I'm not. When I criticize The Bell Curve, it may look like I'm an agent of the left pushing a liberal egalitarian agenda, which I'm not.*
What I am pushing is the consideration of a whole new paradigm. In doing so, I am of course just a conduit.
While the evidence assembled by the University of Iowa researchers is quite complex, their renewed argument is simple: "nature vs. nurture" doesn't adequately explain how we become who we are. That notion needs to be replaced.
Lead author John Spencer:
The nature-nurture debate has a pervasive influence on our lives, affecting the framework of research in child development, biology, neuroscience, personality and dozens of other fields. People have tried for centuries to shift the debate one way or the other, and it's just been a pendulum swinging back and forth. We're taking the radical position that the smarter thing is to just say 'neither' -- to throw out the debate as it has been historically framed and embrace the alternative perspective provided by developmental systems theory.**
"Developmental systems theory" is a vague mouthful, and the scientists behind these observations readily admit that they haven't yet found the most compelling new language to present their ideas to the public. But the basic idea, as I've written in previous posts, is that genes are not static; they are dynamic. Genes interact with the environment to form traits. The more closely scientists look at claims of so-called "hard-wired" behavior and abilities, the more they turn up evidence that actions and talents are formed in conjunction with the culture around them.
John Spencer again:
Researchers sometimes claim we're hard-wired for things, but when you peel through the layers of the experiments, the details matter and suddenly the evidence doesn't seem so compelling...When people say there's an innate constraint, they're making suppositions about what came before the behavior in question. Instead of acknowledging that at 12 months a lot of development has already happened and we don't exactly know what came before this particular behavior, researchers take the easy way out and conclude that there must be inborn constraints. That's the predicament scientists have gotten themselves into.
Imprinting is one of many examples reviewed by the Iowa researchers. In 1935, Viennese zoologist Konrad Lorenz famously discovered that newborn chicks whose eggs were incubated in isolation will still correctly pick the call of their mother over that of another animal. It seemed the perfect little proof of innate ability.
But in 1997, Gilbert Gottlieb discovered the flaw in that assumption. It turned out that when fetal chicks were deprived of the ability to make vocal sounds inside their own eggs -- that is, the ability to teach themselves what their species sounded like -- they were unable to pick the correct maternal sound from various animals.
Another famously innate quality is "dead reckoning," the ability of fish, birds, and mammals (including humans) to establish their current location based on past locations and movement history. How could young geese know how to fly home from 100 meters without trial and error? Because the mystery had no apparent answer, the word "innate" was again used as a catch-all explanation. Then it became clear that mother geese train their goslings' navigational skills through daily walks.
How could baby chicks find their way back to a mother without clear sight of her? It turned out that they simply reversed the directions they had taken when getting lost.
One by one, the Iowa researchers show, scientists have declared basic abilities explainable only by hard-wiring, only to have closer inspection and better tools later reveal a slow learning process. The consistent refrain: abilities form in conjunction with development, community, and context. Genes matter, but actual results require genetic expression in conjunction with the environment.
(One big problem with this new paradigm, explains John Spencer, "is that it's much more complicated to explain why the evidence is on shaky ground, and often the one-liner wins out over the 10-minute explanation.")
The Iowa paper also delves deeply into claims of human language innateness, including what is known as "shape-bias." "Shape bias," the authors write, "simplifies the word learning situation and thereby aids vocabulary development, but it is not innate. Rather, it is the emergent product of a step-by-step cascade."
What does all of this have to do with Einstein's genius or your piano playing? Developmental systems theory tells us that, while genetic differences do matter, they cannot, on their own, determine what we become. From there, the whole idea of innate talent falls apart.
As this blog continues, you'll meet more of the scientists who are documenting and shaping these ideas. One of the things I'd like to do is bring them together as a community and give their umbrella notion a more accessible name. "Developmental genetics" is one possibility. "Environmental genetics" is another.
Suggestions are welcome.
[Thanks to Mark Blumberg, one of the University of Iowa authors and editor-in-chief of Behavioral Neuroscience.]
* I am guilty of being a liberal on most issues, and there are elements of this new paradigm that gel nicely with a liberal sensibility; but there are also some very uncomfortable moral implications to come to terms with. Every writer has biases to be sure, but self-respecting journalists don't ignore or cherry-pick information because they like its political ramifications. I didn't write Data Smog because I wanted to bring down the Internet; I didn't offer some sanguine views on new surveillance technologies because I desire a police state, and I haven't been picking and choosing genetics and intelligence studies to prop up the Obama administration.
** These John Spencer quotes are taken from a University of Iowa press release about the journal article.
After more than a year of rumors and speculation, Bruce Jenner publicly came out as transgender with four simple words: “I am a woman.”
“My brain is much more female than male,” he explained to Diane Sawyer, who conducted a primetime interview with Jenner on ABC Friday night. (Jenner indicated he prefers to be addressed with male pronouns at this time.) During the two-hour program, Jenner discussed his personal struggle with gender dysphoria and personal identity, how it shaped his past and current relationships and marriages, and how he finally told his family about his true gender identity.
The show went to impressive lengths to explain unfamiliar concepts of gender and sexuality to its audience, although it didn't always go smoothly. Sawyer’s questions occasionally came off as awkward and tone-deaf, mirroring a broader lack of understanding by many Americans about the difficulties that trans people face. But Sawyer’s empathy also shone when explaining concepts like gender identity and transitioning to her audience—a rare experience on primetime American television. It was a powerful signal of how much progress the LGBT movement has made over the past twenty years, and of how much progress remains to be made: the T in that acronym still lags behind the other three letters in both social acceptance and legal protections.
In her new book No One Understands You and What To Do About It, Heidi Grant Halvorson tells readers a story about her friend, Tim. When Tim started a new job as a manager, one of his top priorities was communicating to his team that he valued each member’s input. So at team meetings, as each member spoke up about whatever project they were working on, Tim made sure he put on his “active-listening face” to signal that he cared about what each person was saying.
But after meeting with him a few times, Tim’s team got a very different message from the one he intended to send. “After a few weeks of meetings,” Halvorson explains, “one team member finally summoned up the courage to ask him the question that had been on everyone’s mind.” That question was: “Tim, are you angry with us right now?” When Tim explained that he wasn’t at all angry—that he was just putting on his “active-listening face”—his colleague gently explained that his active-listening face looked a lot like his angry face.
What is the Islamic State? Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In India’s state of Uttar Pradesh, the village of Kannauj lies a dusty four-hour drive east of the Taj Mahal, the white-marbled wonder built by the Mughal emperor Shah Jahan in memory of his third and favorite wife. Empress Mumtaz Mahal died in 1631 giving birth to their 13th child. The Taj is Jahan’s grand paean to lost love. But he also mourned his queen in much more personal ways. For one thing, Jahan never again wore perfume. Fragrant oils—known in India as attars—had been one of the couple’s great shared passions.
Then and now, Kannauj was the place to fetch the fine scents—jasmine oils, rose waters, the roots of grasses called vetiver, with a bouquet cooling to the nose. Exactly when attar-making began there, no one is certain; archaeologists have unearthed clay distillation pots dating back thousands of years to the ancient Harappan civilization of the Indus Valley. But today, Kannauj is a hub of a historic perfumery that draws much of the town to the same pursuit. Most of the villagers there are connected to fragrance in one way or another—from sinewy craftsmen who steam petals over wood fires in hulking copper pots to mothers who roll incense sticks in the shade while their toddlers nap on colorful mats nearby.
Leon Trotsky is not often invoked as a management guru, but a line frequently attributed to him would surely resonate with many business leaders today. “You may not be interested in war,” the Bolshevik revolutionary is said to have warned, “but war is interested in you.” War, or at least geopolitics, is figuring more and more prominently in the thinking and fortunes of large businesses.
Of course, multinational companies such as Shell and GE have long cultivated an expertise in geopolitics. But the intensity of concern over global instability is much higher now than in any recent period. In 2013, the private-equity colossus KKR named the retired general and CIA director David Petraeus as the chairman of its global institute, which informs the firm’s investment decisions. Earlier this year, Sir John Sawers, the former head of MI6, Britain’s CIA, became the chairman of Macro Advisory Partners, a firm that advises businesses and governments on geopolitics. Both appointments are high-profile examples of a much wider trend: an increasing number of corporations are hiring political scientists, starting their board meetings with geopolitical briefings, and seeking the advice of former diplomats, spymasters, and military leaders.

“The last three years have definitely been a wake-up call for business on geopolitics,” Dominic Barton, the managing director of McKinsey, told me. “I’ve not seen anything like it. Since the Second World War, I don’t think you’ve seen such volatility.” Most businesses haven’t pulled back meaningfully from globalized operation, Barton said. “But they are thinking, Gosh, what’s next?”
The editors of Smithsonian magazine have announced the winners of their 12th annual photo contest, selected from more than 26,500 entries. The winning photographs from the competition's six categories are published below: The Natural World, Travel, People, Americana, Altered Images, and Mobile. A few finalists are included as well. Captions were written by the photographers. Be sure to visit the contest page at Smithsonian.com to see all the winners and finalists.
This month, many of the nation's best and brightest high school seniors will receive thick envelopes in the mail announcing their admission to the college of their dreams. According to a 2011 survey, about 60 percent of them will go to their first-choice schools. For many of them, going away to college will be like crossing the Rubicon. They will leave their families -- their homes -- and probably not return for many years, if at all.
That was journalist Rod Dreher's path. Dreher grew up in the small southern community of Starhill, Louisiana, 35 miles northwest of Baton Rouge. His family goes back five generations there. His father was a part-time farmer and sanitarian; his mother drove a school bus. His younger sister Ruthie loved hunting and fishing, even as a little girl.
On Inauguration Day 2013, a few minutes after 12 p.m., Raffi Hovannisian stood before a massive crowd at Liberty Square in the heart of Yerevan, Armenia. Thousands of Armenians had gathered in the capital to cheer on their leader: “Raffi! President! Raffi! President!” The man before them was tall and dynamic, his fist thrown into the air like a high-school football star. He drew himself to the microphone and thundered over the crowd: “Armenia! Armenia!” The people whistled and cheered. Many of them did not notice that they were being surrounded by riot police with red berets, reinforced by special units of the armed forces.
At exactly the same time, a few kilometers up a hill, Serzh Sargsyan was taking the oath of office for the presidency of the Republic of Armenia. The entire government was in attendance—all the church leaders, too. The official results had been clear about the incumbent’s victory, with 59 percent of the vote. The man on stage was short, with silver hair and the disciplined expression of a military commander. He spoke solemnly about the challenges still facing the country: unemployment, poverty, emigration.
In a few weeks, millions of college students will enter the real world with dreams of finding work that's meaningful and challenging—and preferably lucrative enough to live roommate-free in a major city. As they embark on their job searches, recent graduates are frequently given the vague advice to "go out and network."
But what exactly should this networking entail? What does one say to a perfect stranger whom one has cajoled into "grabbing coffee," while also telepathically conveying one's desire for a job?
Science has one piece of advice: ask them for advice.

Far from inconveniencing or annoying the advice-giver, research suggests that asking for advice actually boosts perceptions of intelligence.
When healthcare is at its best, hospitals are four-star hotels, and nurses, personal butlers at the ready—at least, that’s how many hospitals seem to interpret a government mandate.
When Department of Health and Human Services administrators decided to base 30 percent of hospitals’ Medicare reimbursement on patient satisfaction survey scores, they likely figured that transparency and accountability would improve healthcare. The Centers for Medicare and Medicaid Services (CMS) officials wrote, rather reasonably, “Delivery of high-quality, patient-centered care requires us to carefully consider the patient’s experience in the hospital inpatient setting.” They probably had no idea that their methods could end up indirectly harming patients.