A study finds that the sexes interpret the world differently, with men more likely to judge it in black-and-white terms
It has long been asserted, at least by those inclined to stereotype, that women are more complex than men. But according to a new study, women may see the world in more complex ways as well.
In a study scheduled for publication in the Archives of Sexual Behavior, three researchers from the University of Warwick in England asked a group of men and women to categorize natural and manufactured objects as being "part of," "not part of," or "somewhat part of" a particular category. All of the object/category pairs in the study were selected because they defied easy categorization (e.g., Is a tomato a fruit? Is billiards a sport? Is a computer a tool?). Nonetheless, the male subjects were far more likely to assert that the objects were completely in or out of a particular category. The women, on the other hand, were more likely to reject absolute answers in favor of the "somewhat" (or "it's not that simple") option.
Lest anyone take the results as an indication of indecision or unwillingness on the part of the women to take a stand on anything, the researchers also tested to see how confident each participant was about his or her categorization. Interestingly, the participants who were most confident in general chose "somewhat part of" as an answer less often than the others. But there was no difference between the sexes in their levels of confidence about their choices. The women were just as absolutely sure the answers were complex as the men were sure they were simple.
Granted, the study sample size was small: only 113 subjects. But still. What do we make of the possibility that men may, as a group, categorize the world in more black-and-white terms, while women see it in more shades of gray? What accounts for that difference? Dr. Zachary Estes, one of the study's authors, isn't sure.
"To speculate a bit, this sex difference is almost certainly a combination of biological predisposition and social environment," he said. "[But] whether the male tendency for absolute judgments is related to assertion, or simplicity, or anything else like that, we simply don't know yet."
In terms of socialization, it's true that our society (and, indeed, many societies) judges men in terms of their competence—which implies, or requires, clear and confident knowledge about subjects. Men are also judged in terms of their ability to command, which requires assertive judgment calls. So given the same set of ambiguous calls to make, it's not surprising that men lean toward more absolute judgments.
Having to maintain a command attitude also influences how a person pursues or processes information. As I've written elsewhere, a commander has a very different agenda and approach than, say, an "explorer." Explorers don't seek to control the world around them. They seek, instead, to understand it. As a result, explorers take the information available to them as a starting point, seeking ever more information that might clarify or expand their understanding. They also have to be comfortable with ambiguity, since the world of the explorer is one that remains largely unknown. The challenge of commanders is very different. Their task is to take whatever information is available in any given moment and winnow it down to a clear, unambiguous decision point.
How does this relate to the research of Estes and his colleagues? Because women may feel less pressure to command, and more freedom to explore, than men do—leaving them more open to seeing or accepting shades of gray.
Of course, there might also turn out to be a biological or neurological component that explains the difference, similar to the brain differences I wrote about recently between people who call themselves conservative vs. liberal. Or perhaps women are more inclined to stay a bit neutral in their judgments for social or psychological reasons. Learning to couch their opinions a bit might help women build a wider social circle or avoid harsh recriminations from bigger, stronger, and more powerful members of the opposite sex.
But whatever the roots of Estes's findings, their implications are intriguing to consider. A former boss of mine once said that he thought the real division between people's world views wasn't conservative vs. liberal. It was between people who saw the world in black-and-white terms and those who saw it, instead, in complex shades of gray.
"The more people see the world in black-and-white terms," he said, "regardless of whether they're on the right or the left, the harder it is for them to change their views on anything. There are only two options for them, and the distance to the other possible viewpoint is too far. People who see the world in shades of gray, on the other hand, can adjust their views more easily, if they get new or conflicting information, because all they have to do is shift to a slightly lighter or darker shade."
So does that mean women are more likely to alter their opinions if presented with new information? It's an interesting possibility that has implications for the boardroom as well as the voting booth.
"Successful" CEOs have traditionally been seen as strong, decisive leaders who take charge—very much the commander role. But in a fast-changing, complex and global market, adapting quickly to change and fostering creative innovation are increasingly important survival skills for companies to master. And those strengths often come more naturally to people who are more comfortable with ambiguity and who see the world, or at least CAN see the world, from multiple viewpoints, or in multiple shades of gray.
Estes says that if he conducted his research among a group of men and women in an executive boardroom, the results might show very little difference in the inclination of men and women to make absolute judgments, because "that might be precisely why [the women] are accepted into an executive role in the first place." But ironically, it might be that very difference, and that willingness to see the world in complex shades of gray, that could give women an edge in leading the companies of the future.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The Republican frontrunner has surged in the polls by taking a tough stance on immigration—and if critics want to stop him, that’s what they need to attack.
A new round of attack ads is heading Donald Trump's way, some from John Kasich's campaign and the super PAC backing him, and more in the future from an LLC created specifically to produce anti-Trump messages.
New Day for America’s 47-second ad splices together some of the Republican front-runner’s most awkward video moments: his suggestion that he might date his daughter, his claim of “a great relationship with the blacks.” The Kasich campaign’s ad turns Martin Niemöller’s famous words “nobody left to speak for me” into a warning from one of John McCain’s fellow Hanoi Hilton POWs that a Trump presidency is a threat to freedom. John Kasich’s Twitter account has fired direct personal challenges at the famously thin-skinned mogul.
It may not start a new war. But it will make it much harder to stop an old one.
For clues to how the Syrian Civil War might finally end—or devolve into an even more nightmarish conflict—look to the congested skies over Syria.
There, the air forces of countries such as the United States, Russia, Turkey, and Syria are all regularly conducting strikes, often at cross-purposes. And there, on Tuesday, Turkish fighter jets shot down a Russian warplane for allegedly violating Turkey’s airspace. As my colleague Marina Koren notes, the episode marks the first time a NATO country has downed a Russian plane in 63 years.
An entire industry has been built on the premise that creating gourmet meals at home is simple and effortless. But it isn’t true.
I write about food for a living. Because of this, I spend more time than the average American surrounded by cooking advice and recipes. I’m also a mother, which means more often than not, when I return from work 15 minutes before bedtime, I end up feeding my 1-year-old son squares of peanut-butter toast because there was nothing in the fridge capable of being transformed into a wholesome, homemade toddler meal in a matter of minutes. Every day, when I head to my office after a nourishing breakfast of smashed blueberries or oatmeal I found stuck to the pan, and open a glossy new cookbook, check my RSS feed, or page through a stack of magazines, I’m confronted by an impenetrable wall of unimaginable cooking projects, just sitting there pretending to be totally reasonable meals. Homemade beef barbacoa tacos. Short-rib potpie. “Weekday” French toast. Make-ahead coconut cake. They might as well be skyscraper blueprints, so improbable is the possibility that I will begin making my own nut butters, baking my own sandwich bread, or turning that fall farmer’s market bounty into jars of homemade applesauce.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
When the birds were reintroduced to New England after a long absence, they chose to live in cities instead of the forests they once called home.
William Bradford, looking out at Plymouth from the Mayflower in 1620, was struck by its potential. “This bay is an excellent place,” he later wrote, praising its “innumerable store of fowl.” By the next autumn, the new colonists had learned to harvest the “great store of wild turkeys, of which they took many.”
Soon, they took too many. By 1672, hunters in Massachusetts had “destroyed the breed, so that ‘tis very rare to meet with a wild turkie in the woods.” Turkeys held on in small, isolated patches of land that could not be profitably farmed. But by 1813, they were apparently extirpated from Connecticut; by 1842, from Vermont; and by 1844, from New York.
In Massachusetts—land of the Pilgrim’s pride—one tenacious flock hid out on the aptly named Mount Tom for a while longer. The last bird known to science was shot, stuffed, mounted, and put on display at Yale in 1847, but locals swore they heard the distinctive calls of the toms for another decade. Then the woods fell silent for a hundred years.
Some conservatives are defying expectation and backing the Vermont senator.
When Tarie MacMillan switched on her television in August to watch the first Republican presidential debate, she expected to decide which candidate to support.
But MacMillan, a 65-year-old Florida resident, was disappointed. “I looked at the stage and there was nobody out there who I really liked. It just seemed like a showcase for Trump and his ridiculous comments,” she recalled. “It was laughable, and scary, and a real turning point.”
So she decided to back Bernie Sanders, the self-described “Democratic socialist” challenging Hillary Clinton. MacMillan was a lifelong Republican voter until a few weeks ago when she switched her party affiliation to support the Vermont senator in the primary. It will be the first time she’s ever voted for a Democrat.
If you want to annoy a scientist, say that science isn’t so different from religion. When Ben Carson was challenged about his claim that Darwin was encouraged by the devil, he replied, “I’m not going to denigrate you because of your faith, and you shouldn’t denigrate me for mine.” When the literary theorist Stanley Fish chastised atheists such as Richard Dawkins, he wrote, “Science requires faith too before it can have reasons,” and described those who don't accept evolution as belonging to “a different faith community.”
Scientists are annoyed by these statements because they suggest that science and religion share a certain epistemological status. And, indeed, many humanists and theologians insist that there are multiple ways of knowing, and that religious narratives exist alongside scientific ones, and can even supersede them.
Would body cameras have made justice speedier for Laquan McDonald? Not without new laws.
On October 20, 2014, a white Chicago Police Department officer named Jason Van Dyke fired 16 shots at a black Chicago resident, a 17-year-old named Laquan McDonald. Van Dyke began shooting while McDonald's back was turned to him, and he continued firing after McDonald had fallen to the ground.
McDonald was not carrying a gun at the time, though the city says he was holding a three-inch knife. He died later that night.
On November 24, 2015, the city of Chicago released a video of the killing, captured by a police-car dashboard camera. Hours before releasing the video, Cook County also charged Van Dyke with first-degree murder—the first Chicago cop to be booked for that crime in almost 35 years.