One seemingly obvious, but still notable, aspect of Southern antebellum ladyhood is the necessary and explicit disqualification of black women. The sphere of Southern ladyhood largely consisted of personal beauty and moral reform, with the first seen as evidence of the second: personal beauty proved personal morality. In the 19th-century white mind, whiteness was an essential component of female beauty, and thus of ladyhood.
From historian Mary Cathryn Cain's article "The Art and Politics of Looking White: Beauty Practices among White Women in Antebellum America":
Antebellum white Americans interpreted visible whiteness as an outward projection of inner virtue or, as the Toilette of Health, Beauty and Fashion maintained, "the face is the mirror of the soul." A beautiful white face, then, reflected an unstained heart, and the skin's translucence was no longer valued solely for its physical beauty: it was valorized as evidence of moral rectitude that allowed a woman's inner light to shine for any observer. Likewise, the Book of Health and Beauty declared that "a hand white and smooth, diversified with bluish veins, presenting to the touch the softness of satin, and to the eye the grateful color of milk" could be read as a clear index of a woman's "moral accomplishments."
In his Analysis of Female Beauty, Wilson Flagg reinforced the attitude that female whiteness was incompatible with negative personal traits. The book consisted of a series of poems, each of which depicted an ideal woman who bore the physical attributes associated with a particular feminine virtue. Flagg describes "Sylvia," the personification of Innocence, by alluding to "her complexion's pearly hues," while "Cecilia," the embodiment of Constancy, looked "as white and spotless as new-drifted snow." Perhaps Flagg's characterization of Piety in "Ophelia" is his most telling: "You cannot think beneath a brow so fair, / One sinful thought was ever harbored there."

Here Flagg explicitly equates whiteness with the absence of sin.
In the Southern antebellum white mind, no black woman could ever qualify as a lady, because whiteness was beauty and beauty was moral cleanliness. But like most of the societal components of white supremacy, as surely as patrolling the boundaries of ladyhood meant keeping blacks locked out, it also meant keeping whites locked in. And so whiteness became not simply a sign of beauty and morality, but a sign of an aristocratic mien. Obviously, being white does not automatically gift you with skin "spotless as new-drifted snow." For such an effect, a healthy industry of powders and cosmetics existed to help create the illusion of moral cleanliness.
But many such cosmetics were railed against by the white aristocracy as unnatural, and the women who applied them were roundly denounced as "painted ladies." Instead, white women were advised to find other ways to perfect themselves--like ingesting chalk and arsenic:
To achieve the desired complexion, middle-class white women ritualized the practices described in beauty manuals--not all of them well advised. Some women dieted, slept with their windows open, or abstained from sleep altogether. Some women swore by warm baths. Others swore by warm beverages; still others swore off hot drinks completely. Some women ate chalk, drank vinegar, wore camphorated charms, bled themselves with leeches or even ingested arsenic to get the desired result. Many refrained from drinking alcohol and reading at night. And almost all middle-class white women avoided the sun.
African-American women from the South, and perhaps from Detroit, Chicago and Harlem, might find that last bit about avoiding the sun particularly poignant. In another era, it was not at all unusual for black parents to advise their children to do exactly that, for fear of them moving from "colored" to "black."
Some of this was raised, a few weeks back, while discussing Kanye West's album, and hip-hop's occasional embarrassing reinforcement of aesthetics born of a phrenological age. Ladyhood isn't what it once was. But the notion that lighter skin confers upon the owner some deeper power is very much with us. We like to call it colorism. But this understates things. It's white supremacy. When black rappers exalt the "sexy young ladies of the light skin breed," they are participating in an exercise inaugurated with their arrival to the West in chains. They are patrolling the borders, caging off women for sure, but just as surely, caging off themselves.
Image taken from "The Three Species of Beauty, as affecting the head and face,'' Alexander Walker, Beauty: Illustrated Chiefly by an Analysis and Classification of Beauty in Woman (New York: W. H. Colyer, 1845), pl. 16. As cited in Cain's article.
Ta-Nehisi Coates is a national correspondent at The Atlantic, where he writes about culture, politics, and social issues. He is the author of The Beautiful Struggle and the forthcoming Between the World and Me.