Every strength has a flip side, as my mother always says. The same communication trait that makes it easy for me to write volumes of words also means that, at times, I talk an awful lot. Someone driven to excel may also drive everyone around them nuts with their singular focus. A tendency to take bold risks can lead to astounding success ... or reckless disaster. And according to a new study published in the journal Psychological Science, that interconnected relationship between strength and weakness may exist in the field of creativity, as well--in the rather scary form of an actual genetic link between high levels of creativity and mental illness.
The idea that highly creative people have more than their share of depression, alcoholism, and other psychological struggles is not new, and anecdotal examples are legion. Van Gogh cut off his ear and suffered from depression and hallucinations before finally committing suicide. The writer David Foster Wallace (who gave such a sharp, witty, irreverent, and highly memorable commencement address to Kenyon College graduates in 2005 that the Wall Street Journal saw fit to reprint it) committed suicide last year at the age of 46. Virginia Woolf, Sylvia Plath, Ernest Hemingway, and scores of other writers, artists, and creative individuals have also taken their own lives. And that doesn't even get into the much larger group who created brilliant works of art even as they battled serious and debilitating depression or other disorders.
There are also numerous examples of more technically inclined geniuses who have struggled with demons of madness. A new graphic novel/comic book called Logicomix delves into the world of the real-life mathematicians who relentlessly pursued a quest for logical certainty in mathematics in the late 19th and early 20th centuries. (A New York Times review of it can be found here.) One of the book's themes, aside from the pursuit of logical perfection, is the mathematicians' struggles to ward off mental illness. One of the logicians, Bertrand Russell, apparently claimed that it was only his love of mathematics that saved him from suicide--even as mental illness haunted his own family: his son was diagnosed with schizophrenia, and his granddaughter took her own life. Another logician, Georg Cantor, died in an insane asylum, and a third, Kurt Gödel, became so paranoid about being poisoned that he starved himself to death.
What causes these brilliant, creative minds to fall into such dark places? Does obsession with an idea--a common trait in those driven to explore and express it, whether in words or formulas--somehow disconnect us from the grounding perspective that a more balanced focus provides? Or are brilliantly artistic or creative people actually predisposed to mental illness?
Possibly the latter, according to just-published research conducted by the Hungarian psychiatrist Szabolcs Keri. (You can access the Psychological Science article here, although there's a charge to view it.) To explore a possible genetic link between creativity and psychosis, Keri focused his research on the T/T variant of the Neuregulin 1 gene. Neuregulin 1 plays a role in a variety of brain processes, including development and the strengthening of communication between neurons. But the T/T variant of the gene has also been associated with a greater risk of schizophrenia and bipolar disorder.
Keri's study was admittedly limited. He interviewed 128 participants, all of whom had "high intellectual and academic performance." They were divided by genotype into three groups: T/T, C/T, and C/C. Keri found no difference among the groups in gender or IQ. But he found a distinct difference when it came to scores on creativity tests: the T/T group scored significantly higher, almost twice as high as the C/C group in some categories.
Why would the T/T group score so much higher on creativity? It may be that the "reduced cognitive inhibition" associated with that variant allows for freer mental wanderings--in more ways than one, since a terrific imagination can also lead to terrific nightmares. But what I found particularly interesting was Keri's thought on why the species would retain a gene variant that causes such big problems. By Darwinian logic, after all, a gene variant that leads to debilitating disorders should die out. And yet the T/T variant persists.
"Why are genetic polymorphisms related to severe mental disorders retained in the gene pool of a population?" Keri asked. "A possible answer is that these genetic variations may have a positive impact on psychological function."
The sword, in other words, might have two sides. Creativity is good for advancing the species, even if it sometimes leads to madness. Nor does that kind of evolutionary trade-off seem to be unique to the Neuregulin 1 gene. Research published this past June by John McDonald, chair of the Biology department at Georgia Tech and chief research scientist at the Ovarian Cancer Institute, raised the possibility that the same characteristic that allowed human brains to develop so much bigger and faster than those of other primates may also be the reason human cells are more susceptible to cancer.
"The results from our analysis suggest that humans aren't as efficient as chimpanzees in carrying out programmed cell death. We believe this difference may have evolved as a way to increase brain size and associated cognitive ability in humans, but the cost could be an increased propensity for cancer," McDonald was quoted as saying.
In an ideal world, the strengths could be separated from the weaknesses, and a perfect species could evolve. But the same law of unintended consequences that plagues so many of our advances--increased longevity leading to overpopulation, antibiotics breeding super-resistant bacteria, computer-controlled systems becoming more vulnerable to viruses and hackers--may be just a continuation of a dichotomy that has been playing out in our DNA for millennia. Our strengths create potential vulnerabilities. There is a dark side to The Force.
A military strategist would call this phenomenon a "reverse salient." A practitioner of Taoism would call it the balance of yin and yang. My mother would simply say it's the way of the world. But if these researchers' hypotheses are correct, it means that growth and creativity are important enough to the species that nature has decided they're worth preserving--even at the cost of cancer and mental illness. And that, itself, is a thought worth pondering.