A study finds that the sexes interpret the world differently, with men more likely to judge it in black-and-white terms
It has long been asserted—at least by those inclined to stereotype—that women are more complex than men. But according to a new study, women may see the world in more complex ways, as well.
In a study scheduled for publication in the Archives of Sexual Behavior, three researchers from the University of Warwick in England asked a group of men and women to categorize natural and manufactured objects as being "part of," "not part of," or "somewhat part of" a particular category. All of the object/category pairs in the study were selected because they defied easy categorization (e.g., Is a tomato a fruit? Is billiards a sport? Is a computer a tool?). Nonetheless, the male subjects were far more likely to assert that the objects were completely in or out of a particular category. The women, on the other hand, were more likely to reject absolute answers in favor of the "somewhat" (or "it's not that simple") option.
Lest anyone take the results as an indication of indecision or unwillingness on the part of the women to take a stand on anything, the researchers also tested to see how confident each participant was about his or her categorization. Interestingly, the participants who were most confident in general chose "somewhat part of" as an answer less often than the others. But there was no difference between the sexes in their levels of confidence about their choices. The women were just as absolutely sure the answers were complex as the men were sure they were simple.
Granted, the study sample size was small: only 113 subjects. But still. What do we make of the possibility that men may, as a group, categorize the world in more black-and-white terms, while women see it in more shades of gray? What accounts for that difference? Dr. Zachary Estes, one of the study's authors, isn't sure.
"To speculate a bit, this sex difference is almost certainly a combination of biological predisposition and social environment," he said. "[But] whether the male tendency for absolute judgments is related to assertion, or simplicity, or anything else like that, we simply don't know yet."
In terms of socialization, it's true that our society (and, indeed, many societies) judges men in terms of their competence—which implies, or requires, clear and confident knowledge about subjects. Men are also judged in terms of their ability to command, which requires assertive judgment calls. So given the same set of ambiguous calls to make, it's not surprising that men lean toward more absolute judgments.
Having to maintain a command attitude also influences how a person pursues or processes information. As I've written elsewhere, a commander has a very different agenda and approach than, say, an "explorer." Explorers don't seek to control the world around them. They seek, instead, to understand it. As a result, explorers take the information available to them as a starting point, seeking ever more information that might clarify or expand their understanding. They also have to be comfortable with ambiguity, since the world of the explorer is one that remains largely unknown. The challenge of commanders is very different. Their task is to take whatever information is available in any given moment and winnow it down to a clear, unambiguous decision point.
How does this relate to the research of Estes and his colleagues? Because women may feel less pressure to command, and more freedom to explore, than men do—leaving them more open to seeing or accepting shades of gray.
Of course, there might also turn out to be a biological or neurological component that explains the difference, similar to the brain differences I wrote about recently between people who call themselves conservative vs. liberal. Or perhaps women are more inclined to stay a bit neutral in their judgments for social or psychological reasons. Learning to couch their opinions a bit might help women build a wider social circle or avoid harsh recriminations from bigger, stronger, and more powerful members of the opposite sex.
But whatever the roots of Estes's findings, their implications are intriguing to consider. A former boss of mine once said that he thought the real division between people's world views wasn't conservative vs. liberal. It was between people who saw the world in black-and-white terms and those who saw it, instead, in complex shades of gray.
"The more people see the world in black-and-white terms," he said, "regardless of whether they're on the right or the left, the harder it is for them to change their views on anything. There are only two options for them, and the distance to the other possible viewpoint is too far. People who see the world in shades of gray, on the other hand, can adjust their views more easily, if they get new or conflicting information, because all they have to do is shift to a slightly lighter or darker shade."
So does that mean women are more likely to alter their opinions if presented with new information? It's an interesting possibility that has implications for the boardroom as well as the voting booth.
"Successful" CEOs have traditionally been seen as strong, decisive leaders who take charge—very much the commander role. But in a fast-changing, complex and global market, adapting quickly to change and fostering creative innovation are increasingly important survival skills for companies to master. And those strengths often come more naturally to people who are more comfortable with ambiguity and who see the world, or at least CAN see the world, from multiple viewpoints, or in multiple shades of gray.
Estes says that if he conducted his research among a group of men and women in an executive boardroom, the results might show very little difference in the inclination of men and women to make absolute judgments, because "that might be precisely why [the women] are accepted into an executive role in the first place." But ironically, it might be that very difference, and that willingness to see the world in complex shades of gray, that could give women an edge in leading the companies of the future.
At least they didn’t go for the giant, spring-loaded needle traps.
The events that led teams of helicopter-borne vets to pelt the Swiss countryside with vaccine-impregnated chicken heads began in 1939. Two things were then sweeping through Poland: the Nazis, and an epidemic of rabies carried by red foxes. Every year, the wavefront of disease advanced southward and westward by several dozen kilometers, hitting country after country. In March of 1967, it reached Switzerland.
The epidemic was a huge problem. Rabies is caused by a virus that spreads through the bites of animals and targets the brain. Unless infected people get a (really expensive) vaccine right away, the disease is almost always fatal. So, something had to be done about the foxes. The usual methods—poisoning, trapping, gassing, and shooting—weren’t working. The alternative was to vaccinate them.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
A pastor and a rabbi talk about kids, poop, and tearing down the patriarchy in institutional religion.
The Bible is a man’s book. It was mostly written by men, for men, and about men. The people who then interpreted the text have also been predominately male.
No wonder there’s not much theology preoccupied with weird-colored poop and the best way to weather tantrums. Throughout history, childcare has largely been considered women’s work—and, by extension, not theologically serious.
Danya Ruttenberg—a Conservative rabbi whose book about parenting came out in April—disagrees. So does Bromleigh McCleneghan, a Chicago-area pastor and the author of a 2012 book about parenting and a forthcoming book about Christians and sex. Both women have made their careers in writing and ministry. But they’re also both moms, and they believe the work they do as parents doesn’t have to remain separate from the work they do as theologians.
The long-running cartoon’s representation of Judaism was one of the first on television.
Growing up in south London, and then in the largely Catholic town of Manhasset on Long Island, I didn’t encounter many families who looked, sounded, or behaved like mine. In England, my experiences were limited to my mother’s family, who were all Orthodox Jews, strictly observing the Sabbath and keeping kosher, and to the families of my classmates, who were invariably gentiles. In Manhasset, I didn’t even have the Orthodox to relate to. So one of my main comforts in both places came from the Pickles family, who—with its big-haired, neurotic, doting mother and its old-world, Yiddish-mumbling grandparents—instantly made me feel at home. It also helped that I could spend time with the Pickles family whenever I wanted; after all, they were on TV.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. President has been a better comedian than Barack Obama. It’s really that simple.
That’s not to say other modern-day presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington, D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Two scholars discuss the ups and downs of life as a right-leaning professor.
“I don’t think I can say it too strongly, but literally it just changed my life,” said a scholar, about reading the work of Ayn Rand. “It was like this awakening for me.”
Different versions of this comment appear throughout Jon A. Shields and Joshua M. Dunn Sr.’s book on conservative professors, Passing on the Right, usually about people like Milton Friedman and John Stuart Mill and Friedrich Hayek. The scholars they interviewed speak in a dreamy way about these nerdy celebrities, perhaps imagining an alternate academic universe—one where social scientists can be freely conservative.
The assumption that most college campuses lean left is so widespread in American culture that it has almost become a caricature: intellectuals in thick-rimmed glasses preaching Marxism on idyllic grassy quads; students protesting minor infractions against political correctness; raging professors trying to prove that God is, in fact, dead. Studies about professors’ political beliefs and voting behavior suggest this assumption is at least somewhat correct. But Shields and Dunn set out to investigate a more nuanced question: For the minority of professors who are cultural and political conservatives, what’s life actually like?
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are among the most plausible threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime: at life-long scales, one in 120 Americans dies in an accident.
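The compounding works out as the report suggests, and it can be checked in a few lines of Python. This is a back-of-the-envelope sketch, not the report's own method; the 79-year lifespan figure is my assumption, not a number from the article:

```python
# Annual odds of dying in a car crash, as cited above: 1 in 9,395
annual_risk = 1 / 9395            # roughly 0.01 percent per year

# Assumed average lifespan in years (my assumption, not from the report)
years = 79

# Survive every single year, then take the complement:
# the chance of dying in a crash at least once in a lifetime
lifetime_risk = 1 - (1 - annual_risk) ** years

print(f"Lifetime odds: 1 in {1 / lifetime_risk:.0f}")  # close to the article's 1 in 120
```

Multiplying that lifetime figure by five, per the report's claim, puts the per-person odds of dying in an extinction event somewhere around 1 in 24—which is what makes the statistic so startling.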
Readers respond to the question with dramatic personal stories and the lessons they learned. To submit your own breakup story, email firstname.lastname@example.org. (And if you’d like to include a song that most resonates with that relationship, please do.)